NGS Leaders Blog

Is There a Niche for a Standalone NGS Supercomputer?


September 28, 2012
Kevin Davies: The launch this week by Knome of the knoSYS 100 "genome supercomputer" has raised a lot of interest in the value and desirability of a dedicated hardware/software solution for NGS data analysis and interpretation.
As first reported at Bio-ITWorld.com, Knome executives clearly feel there is a niche for many users who want a turnkey solution for processing NGS data. I’m sure CEO Martin Tolar doesn’t literally believe that every NGS instrument – about 2,000 are estimated to be running currently around the world – will have a computer next to it, but that doesn’t stop him from raising the possibility.

Knome’s Jonas Lee says that the company, co-founded by George Church as the first personal genome sequencing firm, remains staunchly a software company. "But most institutions have trouble putting together the precise [IT] system to run the software, so why not take that burden off their hands? Not everyone has a really deep IT bench - so we did it for them."

The solution requires some forethought: the knoSYS unit weighs in at almost 600 pounds, and at $125,000, the price is higher than some benchtop sequencers. It contains four 2.4-GHz 8-core Intel Xeon E5-2665 processors, 64 Gigabytes of DDR3 memory, and 18 to 54 Terabytes of usable disk storage. The system runs Knome's kGAP software, and company executives are excited about the ability to create in silico gene "superpanels" for targeted analysis of subsets of a human genome (or exome) sequence, such as genes related to cardiomyopathy or epilepsy.
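Conceptually, such a superpanel is just a post hoc filter over a fully analyzed genome or exome: sequence everything once, then restrict interpretation to a clinician-chosen gene subset. kGAP itself is proprietary, so the following is only a minimal Python sketch of the idea; the panel membership and variant records below are hypothetical illustrations, not Knome's actual data model.

    # Minimal sketch of an in silico gene "superpanel": filter an
    # annotated whole-genome/exome callset down to a chosen gene set.
    # Gene symbols and variants are illustrative examples only.
    CARDIOMYOPATHY_PANEL = {"MYH7", "MYBPC3", "TNNT2", "TNNI3", "LMNA"}

    # Each record: (gene symbol, variant description) from an
    # annotated callset.
    variants = [
        ("MYH7",   "c.1208G>A (p.Arg403Gln)"),
        ("BRCA1",  "c.68_69delAG"),
        ("MYBPC3", "c.1504C>T (p.Arg502Trp)"),
    ]

    def apply_panel(variants, panel):
        """Keep only variants that fall in genes on the in silico panel."""
        return [(gene, v) for gene, v in variants if gene in panel]

    for gene, v in apply_panel(variants, CARDIOMYOPATHY_PANEL):
        print(gene, v)

The appeal of doing the filtering in software rather than in the assay is that the same genome can later be re-queried with a different panel, with no new sequencing.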

But the system’s specs met with some mixed reviews, at least according to two experts interviewed in the Bio-IT World story. The decision to house the computer in a specially soundproofed unit that is wider than a standard data center rack wouldn’t appeal to some research IT folks, said the BioTeam’s Chris Dagdigian, suggesting that the unit really is intended to sit in the lab close to the NGS instruments.

Another NGS expert, HudsonAlpha's Shawn Levy, felt that the quantity of storage was sub-optimal, and also weighed in on concerns about the cloud security of patient data.

Leonid Kruglyak, a geneticist at Princeton University, reacted on Twitter: "Don’t all rush…"

Genomics Machinery

While dedicated software/hardware offerings are rare, there are precedents. A few years ago, the Danish bioinformatics company CLC bio launched the CLC Genomics Machine, "a turnkey solution for large full genome analysis".

CLC bio’s VP of communications, Lasse Görlitz, tells me the company is preparing to launch an upgraded version of the hardware, which will include two 8-core Intel Xeon E5-2650 CPUs, 64 Gigabytes of RAM, 24 Terabytes of storage, and a 64-bit Linux operating system. That’s reasonably comparable with the specs of the new knoSYS, even though the machines have different goals.

Görlitz points out that the CLC Genomics Machine, which weighs about 70 pounds, fits perfectly in a rack or as a stand-alone device (it’s up to the customer). And he candidly states: "We're obviously selling more Genomics Workbench software licenses than these machines, but we find it's a rather popular item with labs and smaller companies who don't already have a huge IT infrastructure in place."

So is there a true market for turnkey NGS data analysis? Do these solutions suit your group’s needs? If not, why not? We’d like to hear from you…

Collaboration in Research: Can Sharing Data Increase Industry Efficiency?


September 18, 2012

Dawn Van Dam, General Manager, Cambridge Healthtech Associates: After interviewing numerous professionals in the life science industry over the past six months, I've seen a common theme emerge: in an ideal world, sharing data with other companies and institutions would allow the development of better technologies and faster solutions. Words like "synergy" and "ecosystem" were used by interviewees to paint a picture of an idealized healthcare landscape in which breakthroughs are accomplished through collaborative efforts - where researchers share both their successes and failures.

In reality, however, the industry gives scientists incentives that understandably keep them from sharing their findings with other industry leaders. Pharma researchers must protect their IP in order to preserve company integrity and maintain job security, and academics need to keep their findings quiet as they vie for the chance to publish. As a result, redundancies are common in research, and those coveted breakthrough results don’t come along as often as they could.

It seems that professionals across the healthcare spectrum - from scientists to CEOs - believe that collaboration in research, through sharing data and findings, is the key to translating research to the clinic faster. Could increased collaboration in research really help push the industry forward, or is it simply too idealistic for a cut-throat, business-oriented world?

Many of the industry professionals who support collaboration in research suggest creating an incentive system that encourages data-sharing. One, a CTO and Head of R&D for a small organization focused on bioinformatics, believes strongly in combining efforts for more efficient research. In an interview for our study, he highlighted the cost and time benefits of collaboration: "I have an impression that many people do the same thing in a slightly different way, and each one with different funding in a different place in the world. So, I have the impression that a lot of energy is actually redundant." Sharing data can ensure that those vital funds aren’t used up by redundant experiments and are instead put toward achieving new breakthroughs.

Sharing data has benefits for research, but could also have serious implications for the business side of the industry. While redundant research depletes resources throughout the industry, competition within the industry forces researchers to work harder and faster towards making breakthroughs. If collaboration in research and data sharing eliminated the threat of a competitor making an important discovery first, would scientists work more slowly toward scientific discoveries?

Perhaps one solution to increase industry efficiency is to develop public-private partnerships, as a CEO and founder of a small European bioinformatics company described. "The [private enterprises] find it difficult to access these data and make use of them to ultimately push forward and develop the best drugs. I think that when you read about good public-private partnerships, where there are these collaborative efforts between the academic world and the private world, that’s when it becomes most effective."

Industry and academic partnerships allow for data sharing and collaboration in research on a smaller and more manageable scale, while maintaining that competitive edge necessary to drive industry forward at a fast pace. With shared resources and combined funding sources, these partnerships can make greater strides in research and avoid repeating the same mistakes as others in the industry, or drawing redundant conclusions.

A contained and organized system, like the public-private partnership, or a consortium, is a small-scale initiative that can encourage collaboration in research and could ultimately help the entire industry function more efficiently.


We would like to acknowledge and thank all of those who provided their thoughts for this six-month study.

If you would like more information about how Cambridge Healthtech Associates™ can help you form, implement or manage a collaborative project in life sciences research and development, please contact me. We have completed many collaborative projects over the last seven years; consequently, our expertise and experience are unparalleled in the industry.
 

CAP and the Clinical Application of Genetics


Editor’s Note: We are pleased to share an article submitted by Nazneen Aziz. Nazneen is the Director, Molecular Medicine, in the Transformation Program Office at the College of American Pathologists. Email: Naziz@cap.org 


September 6, 2012
Nazneen Aziz:
This is an interesting time to be a geneticist. With the whirlwind of recent activity surrounding the adoption of next generation sequencing (NGS) in diagnostic medicine, I am glad I am one. I never regretted choosing to be a molecular geneticist, but the role genetics would play in medicine was not always clear. In the 1980s, when I was a graduate student at MIT working on translational regulation of ferritin mRNAs, there was tremendous optimism about what molecular genetics could do for medicine – an almost irrational hope that disease-causing genes would be discovered and lead to gene therapy and cures.

Later, as a postdoc at the Whitehead Institute, I felt the whirlwind of excitement at the beginnings of the Human Genome Project and the imminent discovery of 20,000 potential drug targets. The early 2000s saw the birth of many biotechnology companies focused on genetics research and genotyping technologies, only for many of them to suffer an unfortunate, perhaps inevitable collapse around 2003-2005.

This was a few years after I had left the Harvard Medical School faculty to join the biotechnology industry. This was not a rosy time to be a geneticist doing human genetics applied research in the biotechnology sector. Geneticists like me began to wonder if we would ever see the true potential of human genetics being applied in clinical medicine in our careers - or in our lifetime.

We certainly could not have imagined that, just six years later in 2012, we would see the entire genomes of patients being sequenced to diagnose disease! Today, genetics is truly being applied in real time to diagnose otherwise inexplicable diseases. Genome-level sequencing is successfully ending the diagnostic odysseys in which patients endured a multitude of tests that never revealed the underlying cause of their disease.

Ironically, genome-level sequencing as a clinical test could not have happened were it not for the development of NGS technologies, which matured around 2005, just as many pioneering genomics firms were collapsing.

Rapid Adoption  

No other technology has seen such rapid clinical adoption. Diagnostic labs are either already offering, or gearing up to offer, clinical tests based on gene panels, exomes, or whole genomes. All of this is possible because NGS has dropped the cost of clinical sequencing dramatically, even beating Moore’s Law. Another important factor is the availability of benchtop, less costly sequencing machines.
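To see why "beating Moore’s Law" is plausible, here is a back-of-the-envelope Python check. The figures are assumed round numbers for illustration (roughly $95 million per genome in 2001 versus about $10,000 in 2012, in the spirit of widely cited NHGRI-style estimates), not exact values.

    # Rough comparison: actual sequencing cost decline vs a Moore's Law
    # analogue in which cost halves every two years. Figures are
    # assumed round numbers for illustration.
    cost_2001, cost_2012 = 95_000_000, 10_000
    years = 2012 - 2001

    # Moore's Law analogue: cost halves every two years.
    moores_law_cost = cost_2001 / 2 ** (years / 2)

    print(f"Moore's Law alone would predict ~${moores_law_cost:,.0f} per genome")
    print(f"Actual fold drop: {cost_2001 / cost_2012:,.0f}x "
          f"vs Moore's Law: {cost_2001 / moores_law_cost:.0f}x")

Even on these rough numbers, costs fell nearly 10,000-fold over a period in which a Moore’s Law halving every two years would predict only about a 45-fold drop.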

With clinical large-scale genomic sequencing comes a multitude of questions and needs:
· What standards will the diagnostic labs apply for these highly complex tests?
· Will insurance companies pay for the tests?
· How do we get patient consent for this complex test that can reveal so much more than the specific condition the patient is tested for?
· How do clinicians report incidental findings (significant but unexpected findings)?
· How do clinicians and laboratory technicians interpret new variants that appear to be pathogenic but have never been reported before?
· And what databases do we refer to in order to create the test reports?

At the College of American Pathologists (CAP), which has a long history of developing gold standards for clinical laboratories and is an accrediting body under CLIA, many of these questions are being considered, debated and discussed vigorously. I lead a new committee formed at CAP called the Next Generation Sequencing Work Group (NGS WG), which consists of about 14 member pathologists, CAP staff, and representative members from the Association for Molecular Pathology and the American College of Medical Genetics. CAP considered this rapidly developing technology area important enough to convene this group and dive into many of these issues, which have no easy answers but clearly need solutions before next generation sequencing becomes a household name.

As a first step, the NGS WG at CAP developed the first set of standards for clinical laboratories for clinical tests using NGS technologies. This checklist for laboratory standards was recently published and is now available to all. However, the work for the NGS WG is not done, as the checklist needs to be expanded and refined to address areas such as cancer and infectious disease. We also hope to explore the development of proficiency test products that assess an operator’s ability to accurately detect and annotate variants.
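To make the proficiency-testing idea concrete, one simple way such a product could score variant detection is to compare a lab's reported calls against a known truth set and compute sensitivity and precision. The Python sketch below is a hypothetical illustration, not a CAP product; the variant tuples (chromosome, position, ref, alt) are invented examples.

    # Hypothetical proficiency scoring: compare a lab's variant calls
    # against a known truth set. Variant tuples are invented examples.
    truth = {("chr7", 117199644, "ATCT", "A"),
             ("chr17", 41245466, "G", "A"),
             ("chr11", 5248232, "T", "A")}

    lab_calls = {("chr7", 117199644, "ATCT", "A"),
                 ("chr17", 41245466, "G", "A"),
                 ("chr2", 21229160, "C", "T")}   # a false positive

    true_pos = truth & lab_calls
    sensitivity = len(true_pos) / len(truth)      # fraction of truth detected
    precision = len(true_pos) / len(lab_calls)    # fraction of calls correct

    print(f"Sensitivity: {sensitivity:.2f}  Precision: {precision:.2f}")

A real proficiency product would of course also need to score variant annotation and interpretation, which is far harder to reduce to a set comparison.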

The excitement is not just present at CAP; several governmental and professional societies are also developing their own guidelines on the multifaceted needs of NGS tests, which involve ethics, interpretation, reporting, consenting, and billing. This flurry of activity and urgency to deal with the practical aspects of this new test is occurring because, for the first time, the true potential of the clinical applications of genetics has been unleashed.

CAP is excited to be a partner with the many organizations driving the future of genomic medicine, and we welcome any ideas, questions, or issues you may have.
