Smarter ELNs and Smarter Labs Make for Smarter Science
By Joe Stanganelli
October 31, 2016 | BOSTON—New technology, new laws and regulations, and bigger, more cumbersome "Big Data" have together rendered the standard, traditional electronic lab notebook (ELN) obsolete. According to a panel of experts speaking in Boston at the Pistoia Alliance's annual US Conference on October 18, new digital-transformation technologies—including the Internet of Things and cloud computing—allow for enhanced ELNs that can address today's clinical pain points of efficiency, ROI, scientific contextual consistency, and risk reduction.
The second panel session of the day was titled "The Future of IP Capture"—and, to be sure, intellectual property capture is one of the primary goals of keeping a well-maintained ELN to begin with—but the session delved deeper into what the ELNs of the past, the present, and the future all respectively look like.
"Those systems got very complex," observed Michael Elliott, CEO of Atrium Research & Consulting, in his opening remarks as session chair, discussing how technological solutions in the lab have advanced so rapidly that the life-sciences sector has struggled to catch up intelligently. "We're at that point where we're saying, ‘Where are we?’"
The Pain Points of Collaboration, Context, and Catching Up to the Data
Margaret DiFilippo, Vice President of Sales in North America for Dotmatics, kicked off the series of presentations with some sobering statistics on life-science externalization.
"Seventy-five percent of biopharmaceutical companies have at least one partner—and most have many, many more," noted DiFilippo in her presentation, The Future of IP Capture in an Increasingly Collaborative World. "Forty percent of pharma's budget is spent outside the company rather than inside. The external investment in industry spending far exceeds the internal investment that is being made."
On top of these costs (and, likely, substantially contributing to them), DiFilippo noted that the standards of collaboration that have prevailed in the industry have made it "very difficult to track who owns what."
"An estimate[d] 60%-70% of the collaborative partnerships fail to deliver what they intended to; there's poor communication, unclear roles and responsibility, data… inconsistencies, and cultural differences," lamented DiFilippo. "So the external [cost factors] are changing the way we look at IP."
DiFilippo was not alone in noting that ontology in particular presents problems for collaboration and context.
"We all develop our own different vocabularies around these things," said panelist Dana Vanderwall, Director of Biology & Preclinical IT at Bristol-Myers Squibb, in his following presentation, The ELN: What Problem Are We Trying to Solve (Now)? "So imagine what it's like now to go around and collect all the context in a different study and contextualize it[.]"
Similarly, even internal data is not being used to its fullest advantage.
"We've created so much legacy data out there, and how many experiments have been duplicated because we didn't know the guy down the hall did the same experiment 20 years ago?" said DiFilippo. "The key is being able to get that needle in that haystack, and being able [to] query across all … databases is key."
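In its simplest form, DiFilippo's "needle in the haystack" is a federated keyword search across separate ELN data stores. The sketch below is purely illustrative, assuming a hypothetical setup in which each site exports an `experiments` table to its own SQLite file:

```python
import sqlite3

# Hypothetical schema: each site's ELN export contains an `experiments`
# table with (id, title, assay, created) columns.
def find_prior_work(db_paths, keyword):
    """Search several ELN exports for experiments whose title matches a keyword."""
    hits = []
    for path in db_paths:
        conn = sqlite3.connect(path)
        rows = conn.execute(
            "SELECT ?, id, title FROM experiments WHERE title LIKE ?",
            (path, f"%{keyword}%"),
        ).fetchall()
        hits.extend(rows)  # each hit: (source database, experiment id, title)
        conn.close()
    return hits
```

A scientist planning a kinase assay could call `find_prior_work(site_databases, "kinase")` before starting, surfacing the experiment "the guy down the hall" ran years ago.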
As things stand now, life-science data have coalesced into Big Data—becoming unwieldy and confusing.
"We're bringing high-content platforms further back in discovery and into the clinic, and we're getting further efficient [while] generating an enormous amount of data, [but] whether or not we're able to keep up with the capture in that process is a whole 'nother story," said Vanderwall. "While we are on the cusp, there are some things that we are getting better at, but at the end of the day we're still learning… [In] some cases we're getting better at that… and then you generate the wrong data and you've got to do some data reduction[.]"
Cloud Platforms and Big Data: Evangelizing the Opportunity
"This is the 'Big Data Problem.' I don't see it as a problem. I see it as an opportunity. We have to think about how this information we're… collecting [and] use [will] actually enable science—enable scientists—to make easier, faster decisions… based on the quality of information they already have," said panelist Joshua Bishop, Associate Director of Business and Technology Analysis for CPH IT Informatics at Merck & Company. "We want to lower that barrier. It can be done with a number of solutions that are already out there."
For Bishop, the solution to this problem-cum-opportunity lies in the same platforms that are collecting all of these data and making them so problematic in the first place—by way of cloud platform-created network effects.
"A data network effect… is when goods and services become better in performance as they get more data—[such as] AWS, Salesforce, [and other cloud] platforms that learn what you're trying to put in, and understand the information that you're trying to put in and [then] feature recommendations," said Bishop (using the classic "If you like [x], then you might also like this" example). "Thinking about them from a program perspective, you have your regular systems, you have your ELN, you have your instruction data-capture tools … [and] each of these individual tools by themselves don't necessarily do anything spectacular—but when you pull that information together … in a way that demonstrates … intelligence … [then] this is where the value of the data network effect really comes into play."
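Bishop's "if you like x" example boils down to co-occurrence counting: the more usage data the platform collects, the better its recommendations get. A minimal sketch, using hypothetical lab-tool usage sessions rather than any real platform's API:

```python
from collections import Counter
from itertools import combinations

def cooccurrence_recommender(sessions):
    """Build 'users who used X also used Y' recommendations from usage logs.

    `sessions` is a list of sets of item names (hypothetical usage data).
    More sessions sharpen the counts -- the data network effect in miniature.
    """
    pair_counts = Counter()
    for items in sessions:
        # Count every unordered pair of items that appeared together.
        for a, b in combinations(sorted(items), 2):
            pair_counts[(a, b)] += 1

    def recommend(item, n=3):
        scored = Counter()
        for (a, b), count in pair_counts.items():
            if a == item:
                scored[b] += count
            elif b == item:
                scored[a] += count
        return [other for other, _ in scored.most_common(n)]

    return recommend
```

Fed with sessions like `{"hplc", "eln"}`, the returned function recommends the tools most often used alongside a given one; real platforms layer far more sophistication on the same underlying idea.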
DiFilippo's sentiments were in line with Bishop's: for her, the cloud is the best way to enable effective collaboration.
"Some of the character of a collaborative informatics solution that we feel is very necessary is [that] you need to start with secure scientific data storage and exchange," said DiFilippo. "We're seeing now that[,] for the cloud, it's getting to a point where it's about 50-50 as far as [cloud versus] on-premises, but when you're dealing with collaborations throughout the world, cloud becomes even more important—and then you need to convince others[.] As we move forward in the future, you're going to see that more and more companies are going to be adopting cloud technology."
Bishop echoed DiFilippo's call for evangelizing advanced ELN solutions.
"I've had an electronic [lab] notebook my entire career. I've always known software and tools to be useful to scientists, so it's a little bit difficult [for] me to extract myself [and relate] to researchers who haven't had these tools," said Bishop. "You need to showcase the utility. You need to showcase what it can do for scientists in their everyday li[ves] and make them understand what that value is and how it makes their science move forward. It's really dependent on the type of science that's being done, the environment that they're in, [and] the ecosystem [that] they are part of."
There is far more to advanced ELN-enabling technology adoption than evangelism, however. The technology has to be able to actually, demonstrably work.
"You [have to] think about putting functionality in there that makes [the technology] more useful to go there [in the lab]. The other side of that is that the better we structure the description of the experiments that are being run, and maybe enable a better capture of the human observations, the more searchable we can make the data—any of the data—and the more context we put around the experiments," advised Vanderwall. "If we can make the instrument smarter [while] things are executed instead of [as] some post-hoc matter … maybe the purpose for the notebook is not to require people to type things that computers can capture, but also [about their] observations about stuff."
To this end, in order to have more advanced ELNs, and in order to record, extract, and use their data more effectively, scientists need more advanced lab equipment—and more advanced labs.
The Smart Lab
In an earlier presentation that day titled The Laboratory of Things, Gene Tetreault, Senior Director of Unified Lab Management at Dassault Systèmes BIOVIA, bubbled with excitement over how "cool" and "applicable" IoT is to today's lab.
"I've been doing this for a long time, and it's just crazy to me that over the years I keep hearing over and over again, 'I don't know what equipment we have; we don't know what the status of that equipment is; we don't know how much it's being used,'" said Tetreault. "IoT is a way of keeping a status report [because] maybe we can keep track of the performance of last week, we can keep track of the usage of equipment, things of that sort—and of course this is all enabling analytics."
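The status report Tetreault describes amounts to tracking heartbeats and usage per instrument. A minimal sketch of such a registry, with an invented (hypothetical) API rather than any vendor's actual product:

```python
import time
from dataclasses import dataclass

@dataclass
class Instrument:
    name: str
    last_seen: float = 0.0  # timestamp of the most recent heartbeat
    runs: int = 0           # how many times the instrument has been used

class LabRegistry:
    """Illustrative IoT-style equipment registry (hypothetical API)."""

    def __init__(self, stale_after=300.0):
        # An instrument with no heartbeat for `stale_after` seconds is "stale".
        self.stale_after = stale_after
        self._instruments = {}

    def heartbeat(self, name, now=None):
        now = time.time() if now is None else now
        inst = self._instruments.setdefault(name, Instrument(name))
        inst.last_seen = now

    def record_run(self, name):
        self._instruments.setdefault(name, Instrument(name)).runs += 1

    def status(self, now=None):
        """Map each instrument to (online/stale, usage count) for analytics."""
        now = time.time() if now is None else now
        return {
            n: ("online" if now - i.last_seen < self.stale_after else "stale", i.runs)
            for n, i in self._instruments.items()
        }
```

Answering "what equipment do we have, what's its status, how much is it used" then becomes a single call to `status()`, and the accumulated counts feed the analytics Tetreault mentions.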
Moreover, these digital-transformation tools that enable data analytics for laboratory management can similarly enable better science and drug discovery.
"The [aspect] you really want to go after is the value of greater insights, and that [is] particularly [true] in the stages of R&D," said Vanderwall. "[This] is kind of tough because our real successes and failures are so far downstream in that transition between Phase 1 and Phase 3 [of clinical trials that] it's pretty hard to map it all the way back to stuff that's being done in discovery, and sometimes even development, … and missing realization at this point. You can start drawing those numbers now [with smarter technology]."
Tetreault, for his part, pointed to the laborious nature of manually recording things in a lab notebook—electronic or otherwise—as something that inhibits the scientist. As such, data- and knowledge-sharing, nonstandard ontologies and formats, and inefficient and/or unintegrated processes are all pain points that IoT and other digital-transformation technologies can solve.
"You write down exactly what you need to write down because you really want to focus on the science. With IoT, … we can do some powerful things. We can collect all that data and then do some powerful analytics on the backend," advocated Tetreault. "It's a challenge, right? It's one of those things that's not value-added and it's really tough and time-consuming to do… We can collect data from [lab] equipment automatically [with IoT]."
DiFilippo emphasized the importance of automatic, real-time processes and communication so as to keep IP capture and other ELN processes more efficient—yielding higher-quality data in the long run.
"Real-time communication … is key, and there are many systems that… now… can make that communication available," said DiFilippo. "If you have that ELN system, it's very easy to track who did what when. Full audit trail is very important to being able to track who did the IP."
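The "full audit trail" DiFilippo calls for is, at bottom, an append-only record of who did what when. One common way to make such a record tamper-evident is to hash-chain the entries; the sketch below is illustrative only, not a description of any particular ELN's implementation:

```python
import hashlib
import json
import time

class AuditTrail:
    """Append-only audit trail: each entry records who did what when,
    chained by hash so later tampering is detectable (illustrative sketch)."""

    def __init__(self):
        self.entries = []

    def record(self, user, action, timestamp=None):
        ts = time.time() if timestamp is None else timestamp
        prev = self.entries[-1]["hash"] if self.entries else ""
        body = {"user": user, "action": action, "ts": ts, "prev": prev}
        # Hash the canonical JSON of the entry body, including the previous hash.
        body["hash"] = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(body)

    def verify(self):
        """Recompute every hash; any edited or reordered entry breaks the chain."""
        prev = ""
        for e in self.entries:
            body = {k: e[k] for k in ("user", "action", "ts", "prev")}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if e["prev"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True
```

Recording "alice created experiment EXP-1" and later "bob signed EXP-1" yields a chain where rewriting the first entry invalidates everything after it, which is exactly what makes such a trail useful for establishing who did the IP.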
Augmenting the Reality of the Lab
One automated, real-time communication technology that Pistoia Alliance US Conference speakers were particularly keen on was augmented reality (AR)—along the lines of Google Glass.
"[An AR display] gives me the ability to identify that item and immediately tells me what its status is… or tells me whether it's not calibrated, or tells me when it's going to [be] finished, or whatever it might be. There are very powerful things we can do with things like Google Glass or other forms of augmented display," said Tetreault. "[The scientist] can look through the incubator, so now he's looking at the door and he's looking at what's inside of it all through this display."
Tetreault also spoke about how interacting with augmented reality via hand motions, similar to how one plays with an Xbox Kinect or a Nintendo Wii controller, could be used to control lab equipment—and how this is more beneficial than physically performing the actions.
"I don't want to touch...and contaminate things. … By just doing a certain hand motion, I can signal what I want to do and the lab is in operation," said Tetreault. "You can do things where [the ELN is] watching your motions and recording your motions, and it's storing your motions in the database as a scientific action so you don't have to record manually in a notebook—so all of this gives us an ability to capture more data [and] gives [the reviewer] some insight in[to] the overall process."
"Google Glass needs to be in the lab sooner rather than later," said Bishop, tying Tetreault's evangelism into his own overall point. "It's really about data extraction … Thinking about what we need to do next and where we need to be."
DiFilippo was more focused on the IP-capture aspects of ELN data.
"Over 80% of a company's value is attributed to IP. Those assets need to be captured systematically and automatically," DiFilippo pronounced. "They need to be qualified in uniqueness, [they] may reside on one machine [or] may reside on many machines in many different countries,… and it needs to be protected by an audit trail."
Tetreault—for all of his self-confessedly "geek[y]" enthusiasm—was similarly focused on the bottom line, tying the enabling technologies to value.
"We are enabling the scientists to do more science by taking away those tasks that they used to do that would take them much longer. If I have to take a minute to put down my samples [when] that could automatically be documented, then it's a minute less of me doing science," said Tetreault. "[W]e can focus on utilizing the assets we have better, and then use the capital that we have to purchase other assets that might give us additional scientific benefit—instead of just having a waste of extra assets that might be redundant."