ucsdhealthsciences:

Biomedical science by the (very big) numbers
Almost daily, it seems that health science researchers in San Diego, the United States and around the world report or announce a novel technology, process, approach, means or method for measuring, mapping and monitoring human health.
The goal, of course, is to translate this ever-growing amount of newly derived health data into tangible treatments and cures, a feat that increasingly appears as overwhelming as the amount of data.
That point was made abundantly clear recently when the National Science Foundation announced solicitations for its “Big Data” project, a joint effort with the National Institutes of Health to develop and advance the scientific and technological means of managing, extracting, analyzing and visualizing huge, diverse, distributed and heterogeneous data sets.
In other words, making sense and use of raw information for real-world ills.
It’s a massive undertaking, says Lucila Ohno-Machado, MD, PhD, the founding chief of UC San Diego’s Division of Biomedical Informatics and a key player in this type of effort. Speaking about the White House event that announced the project, Ohno-Machado said:
“Every day Americans go to clinics and hospitals and their data are recorded for treatment, but are not consistently used to make new discoveries or improve healthcare. By developing a means of using these data without compromising patient privacy, we may accelerate science and improve care for all.”
There are a number of obstacles to overcome, says Wendy Chapman, PhD, associate professor in the Division of Biomedical Informatics and director of dissemination for iDASH, an acronym for integrating Data for Analysis, Anonymization and Sharing, the latest addition to the National Centers for Biomedical Computing funded by the NIH. iDASH’s explicit goal is to advance methods for anonymizing and sharing research data. It is built on a platform that pairs high security and privacy standards with massive, scalable storage and computing power.
For example:
Today, most data (which is mostly non-clinical) resides in repositories governed by limited data use agreements among the biomedical researchers and institutions most likely to benefit from the material. In the future, these databases will need to be annotated and managed using reliable patient informed consent systems and a certified trust network developed as part of iDASH and SCANNER, which stands for Scalable National Network for Effectiveness Research, a project funded by the Agency for Healthcare Research and Quality. There will be incentives for securely sharing information so that effective science can be done at its fullest and fastest.
Today, the system is indisputably incomplete. Computer scientists are looking for data; biomedical and behavioral scientists are looking for analytics. The amount of data stored out there is huge, but the high-performance computing required to crunch the numbers effectively is limited to just a few institutions.
One of those institutions is UC San Diego, which is home to both the San Diego Supercomputer Center and the California Institute for Telecommunications and Information Technology, or Calit2.
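
(An aside of my own, not part of the UCSD post above: to make “anonymizing” a little more concrete, here is a minimal toy sketch, in Python, of de-identifying a record before it is shared. The field names and rules are assumptions I made up for illustration; they are not anything from iDASH or SCANNER.)

# Toy de-identification sketch; illustrative only, field names are hypothetical.
DIRECT_IDENTIFIERS = {"name", "ssn", "address", "phone"}

def deidentify(record):
    """Drop direct identifiers and coarsen the birth date to a birth year."""
    shared = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "birth_date" in shared:
        shared["birth_year"] = shared.pop("birth_date")[:4]  # keep only the year
    return shared

patient = {"name": "Jane Doe", "ssn": "000-00-0000",
           "birth_date": "1948-03-17", "diagnosis": "type 2 diabetes"}
print(deidentify(patient))  # {'diagnosis': 'type 2 diabetes', 'birth_year': '1948'}

A real trust network would of course go much further (complete identifier lists, audit trails, consent checks), but the basic idea is the same: strip or coarsen anything that identifies the patient before the data ever leave the institution.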

ucsdhealthsciences:

How Genes Organize the Surface of the Brain
The first atlas of the surface of the human brain based upon genetic information has been produced by a national team of scientists, led by researchers at the University of California, San Diego School of Medicine and the VA San Diego Healthcare System. The work is published in the March 30 issue of the journal Science.
The atlas reveals that the cerebral cortex – the sheet of neural tissue enveloping the brain – is roughly organized into genetically defined divisions that differ from those of other brain maps based on physiology or function. The genetic atlas provides scientists with a new tool for studying and explaining how the brain works, particularly the involvement of genes.
“Genetics are important to understanding all kinds of biological phenomena,” said William S. Kremen, PhD, professor of psychiatry at the UC San Diego School of Medicine and co-senior author with Anders M. Dale, PhD, professor of radiology, neurosciences, and psychiatry, also at the UC San Diego School of Medicine.
According to Chi-Hua Chen, PhD, first author and a postdoctoral fellow in the UC San Diego Department of Psychiatry, “If we can understand the genetic underpinnings of the brain, we can get a better idea of how it develops and works, information we can then use to ultimately improve treatments for diseases and disorders.”
The human cerebral cortex, characterized by distinctive twisting folds and fissures called sulci, is just 0.08 to 0.16 inches thick, but contains multiple layers of interconnected neurons with key roles in memory, attention, language, cognition and consciousness.
Other atlases have mapped the brain by cytoarchitecture (differences in tissue structure) or by function. The new map is based entirely upon genetic information derived from magnetic resonance imaging (MRI) of 406 adult twins participating in the Vietnam Era Twin Study of Aging (VETSA), an ongoing longitudinal study of cognitive aging supported in part by grants from the National Institutes of Health (NIH). It follows a related study published last year by Kremen, Dale and colleagues that affirmed that human cortical regionalization is similar to and consistent with patterns found in other mammals, evidence of a common conservation mechanism in evolution.
More here

I wonder if other organs can be atlas-ed like this.
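
(Another aside of my own, just to picture how an atlas can fall out of data like this. The Python sketch below is not the study’s actual method and the numbers are invented; the point is only that if you know how genetically correlated each pair of cortical locations is, clustering that correlation matrix carves the surface into divisions.)

# Toy sketch only: cluster made-up "cortical locations" by a synthetic
# genetic-correlation matrix. Not the Science paper's pipeline.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
n = 12                                     # pretend we measured 12 spots on the cortex
truth = np.repeat([0, 1, 2], 4)            # three hidden "genetic divisions"
corr = np.where(truth[:, None] == truth[None, :], 0.8, 0.1)  # high within, low between
noise = rng.normal(0.0, 0.05, (n, n))
corr = corr + (noise + noise.T) / 2        # a little symmetric noise
np.fill_diagonal(corr, 1.0)

dist = 1.0 - corr                          # similar locations -> small distance
condensed = dist[np.triu_indices(n, k=1)]  # pairwise distances in scipy's condensed form
labels = fcluster(linkage(condensed, method="average"), t=3, criterion="maxclust")
print(labels)                              # locations in the same hidden division get the same label

Swap in real genetic correlations estimated from twin data and many more surface locations, and the same general idea yields a genetically based parcellation.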

Science writing, technical writing, technological writing: what’s the difference? Whereas technical writing is a kind of bare exposition, science writing is an elucidation of scientific and technical advances, and technological writing reports on advances in the tools that connect us to the world, whether they come from computer science or the pure sciences. Technical writers explain the mechanisms behind the advances. Science writers aim to educate the general public about current developments in science and to interpret their implications for humanity. Technological writers may cover an array of technology, from smartphones to social media to the latest Apple news. Technical writing is marked by precise, methodical explanation. In contrast, science and technological writing are exemplified by crafted sentences that embrace science and technology as an art. (I really appreciate Google’s second definition of art: “Works produced by such skill and imagination.” Science and technology are the result of skill and imagination applied to the natural world and to artificial intelligence.)
The focus of this course is science writing. Before we can become science writers, we must identify what makes a good science writer. As an aspiring scientist, I find science exciting, and a good science writer should be able to convey that excitement to the general public. Science writing should not be weighed down by technical terms. It should interpret the impact and potential of a scientific development, and it should simplify the science so that once you have read it, you wish you had been clever enough to turn something overwhelming into something easy to understand. And of course, good science writing should have the elements of good writing in general: fluent sentences and organized, communicative thought.
Before my search for good science writing, I thought the genre was limited to news stories. I rely on various websites and tumblr blogs for my science news, and UCSD’s is one of my favorites. I like the way the news article “A New Approach to Faster Anticancer Drug Discovery” traces the shift from past drug discovery methods to the promising potential of new approaches that identify genetic pathways. I am amazed by the role of bioinformatics and genomics in pharmaceutical development, and this article concisely describes that potential. Once my formal search for good science writing began, I browsed the New York Times archive and found a tale of another shift in methods in the science world. In “Cancer’s Secrets Come Into Sharper Focus,” George Johnson describes scientists’ growing recognition of the significance of non-coding DNA (or, as it used to be called, junk DNA). Johnson succeeds in making the history of cancer research interesting.
I also browsed through “The Best American Science Writing 2006.” Having thought science writing was limited to news stories, I was surprised to see personal anecdotes included in the anthology. In “My Bionic Quest for Bolero,” for example, one man wrote of his quest to hear his favorite piece of classical music again. He first heard it as a teen in the 1970s, lost his hearing completely in 2001, and decided to get a cochlear implant, a device optimized for speech rather than music. With the help of audiologists and software engineers, the implant’s software was reprogrammed to help him hear music. It was a touching personal story that blended science and technology writing.
