The Next-Generation Screening Lab: What, Why, How?

March 16, 2018

Contributed Commentary By Dr. Patrick Courtney and Kevin Teburi 

March 16, 2018 | As part of the ELRIG Drug Discovery conference in Liverpool (UK) in October 2017, a group of 60 delegates came together to discuss the lab of the future in a workshop organised by the Standards in Laboratory Automation (SiLA) consortium. The workshop focused on four themes: users, suppliers, interoperability, and news of German SmartLab initiatives. To draw on the considerable experience in the room, speakers presented their views and then opened the floor to discussion. We took notes.

A User View: Building The Next-Generation Lab Designed For Collaboration

Mark Wigglesworth from AstraZeneca and Andrew Mitchell of Unilever presented the construction of major new laboratory facilities for two different industries: pharmaceuticals and materials science. Science is on display in both new labs, which are engineered to enable innovation. Empowering collaboration through next-generation lab design is a major change from the traditional bench setting and increasingly important to scientific and commercial success. The shift from individual experiments to shared models of discovery can be challenging. Existing electronic lab notebooks (ELNs) are often only “paper on glass” and fail to capitalize on data now accessible via digitization.

Simplification of workflows across the setup-run-capture-analyse discovery cycle was also discussed. Instruments must be ready to run – from automated preparation to automated clean-up. This can be especially challenging with shared equipment and inexperienced users. To overcome this, access to the right IT support is critical for success. Troublesome PCs, with their security upgrades and shared workspaces, are being replaced by standard tablets and online services to smooth this transition. Experiment, equipment, and building monitoring provides an additional layer of information and experimental metadata, which in turn should reduce inter-operator and inter-site variability. It also enables customizable dashboards, accessible through mobile apps, giving scientists finer control over how their work is conducted. Collaboration at a distance and protocol transfer are aided by real-time video and audio capture for virtual presence, which has become available only in the past 10 years. Looking beyond the smartphone era, collaboration can be further enhanced by AR/VR and wearable technologies.
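As a simple illustration of the monitoring idea, the sketch below shows how instrument and environment telemetry could be attached to an experiment record as metadata and serialised for a dashboard or mobile app to consume. The class names and JSON layout are illustrative assumptions, not any particular vendor's or ELN's schema.

```python
# A minimal sketch: capture instrument/environment telemetry as experimental
# metadata on an experiment record. All names here are hypothetical.
import json
import time
from dataclasses import dataclass, field, asdict
from typing import List

@dataclass
class Reading:
    instrument: str       # e.g. "plate-reader-03"
    parameter: str        # e.g. "incubator_temperature_C"
    value: float
    timestamp: float = field(default_factory=time.time)

@dataclass
class ExperimentRecord:
    experiment_id: str
    operator: str
    site: str
    readings: List[Reading] = field(default_factory=list)

    def log(self, instrument: str, parameter: str, value: float) -> None:
        """Capture a telemetry point as experimental metadata."""
        self.readings.append(Reading(instrument, parameter, value))

    def to_json(self) -> str:
        """Serialise the record so a dashboard can consume it."""
        return json.dumps(asdict(self), indent=2)

record = ExperimentRecord("EXP-2017-1042", operator="a.scientist", site="Liverpool")
record.log("plate-reader-03", "incubator_temperature_C", 36.8)
record.log("liquid-handler-01", "dispense_volume_uL", 25.0)
print(record.to_json())
```

Capturing this context alongside the results is what makes it possible to compare runs across operators and sites later on.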

What Is Industry 4.0 And How Will It Impact The Lab?

Rob Harkness of PAA and Paul Kendall of Festo linked the B2C world of smartphones and social media with the B2B world of the digital factory and collaborative robotics. Key themes and tools becoming widely used in the lab of the future include the internet of things (IoT) and cloud computing, under the banner of Industry 4.0. While not a new concept, Industry 4.0 is a collective label for connecting various techniques and technologies around a common theme: extracting more from the interconnected lab environment. These technologies eliminate non-value-add lab activities, enabling bench scientists to apply their intellectual skills where they will be of most value: inside the lab to identify and resolve failures, and outside the lab to make the mental leaps needed for true innovation.

Overcoming The Interoperability Challenge

While individual instruments can work well in the lab, interoperability poses a challenge. Membership organisations such as the Pistoia Alliance, SiLA, and AnIML are working together and with stakeholders on areas of mutual interest, including the laboratory of the future.

The Standards in Lab Automation consortium, SiLA, has focused on in-lab command and control and has had considerable success helping major end users build large integrated systems, mainly in Europe. In each case, vendors adapted the interfaces to their products, and these product variants have been made available, with some 60 devices listed on the SiLA website. As technology has evolved, with improved protocols, increased use of wireless, and concepts such as IoT, a new version called SiLA2 has been developed that will be fully compatible with the most modern technologies.
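To make the command-and-control idea concrete, here is a minimal sketch of a common device interface that an orchestrator could use to drive mixed-vendor equipment in the same way. The classes and method names are illustrative only and are not the SiLA2 API itself.

```python
# Illustrative sketch of the command-and-control concept: every device
# exposes named commands behind one generic interface, so an orchestrator
# does not need vendor-specific code. Not the SiLA2 specification.
from abc import ABC, abstractmethod
from typing import Any, Dict

class LabDevice(ABC):
    """Common interface every connected instrument implements."""

    @abstractmethod
    def commands(self) -> Dict[str, str]:
        """Names and descriptions of the commands this device offers."""

    @abstractmethod
    def execute(self, command: str, **parameters: Any) -> Dict[str, Any]:
        """Run a named command and return a structured response."""

class Centrifuge(LabDevice):
    def commands(self) -> Dict[str, str]:
        return {"Spin": "Spin the rotor at a given speed for a given time."}

    def execute(self, command: str, **parameters: Any) -> Dict[str, Any]:
        if command != "Spin":
            raise ValueError(f"Unknown command: {command}")
        # A real driver would talk to the instrument here.
        return {"status": "finished",
                "rpm": parameters["rpm"],
                "duration_s": parameters["duration_s"]}

# The orchestrator only needs the generic interface, not vendor specifics.
device: LabDevice = Centrifuge()
print(device.commands())
print(device.execute("Spin", rpm=3000, duration_s=120))
```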

On the data side, the Analytical Information Markup Language (AnIML) captures analytical data, working with ASTM and IUPAC as established and experienced standardization bodies. AnIML and SiLA enable the transition from file-based exchange to true communication, using XML-based formats that are human-readable yet efficient and compatible, bringing a range of benefits. Many instances and success stories were described in analytical chemistry (chromatographic and spectroscopic methods), biologics, and standard lab devices such as balances and pH meters.
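As a rough illustration of what structured, human-readable data exchange looks like in practice, the short Python sketch below builds a simplified XML document in the spirit of AnIML. The element names are deliberately simplified and do not follow the official AnIML schema.

```python
# A minimal sketch of structured, human-readable analytical data capture.
# Element names are illustrative, not the official AnIML schema.
import xml.etree.ElementTree as ET

doc = ET.Element("AnalyticalData")
ET.SubElement(doc, "Sample", attrib={"id": "S-001", "name": "compound-42"})

step = ET.SubElement(doc, "ExperimentStep", attrib={"technique": "UV-Vis"})
series = ET.SubElement(step, "Series", attrib={"name": "absorbance", "unit": "AU"})
for wavelength_nm, absorbance in [(260, 0.42), (280, 0.31), (300, 0.12)]:
    ET.SubElement(series, "Point",
                  attrib={"x": str(wavelength_nm), "y": str(absorbance)})

# Pretty-print so both humans and software can read the result.
ET.indent(doc)  # available in Python 3.9+
print(ET.tostring(doc, encoding="unicode"))
```

The point is not the exact tag names but that every value carries its context (sample, technique, units), so another system can consume the data without reverse-engineering a proprietary file.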

Using Lab Data To Train AI Engines

The workshop closed with a set of three fascinating presentations: Mario Bott from the Fraunhofer Institute presented niCLAS, an academy, reference lab, and future lab rolled into one; Markus Sebeck presented the SmartLab cooperation network in Northern Germany; and Stephan Heyse from Genedata spoke on the evolution of screening from a linear process to a more iterative Design-Make-Test-Analyse cycle that uses AI techniques to deliver better results more quickly.

The intelligent use of experimental data and results, integrated across multiple labs, will help define the next generation of automated experiments. These experiments, at a scale impossible to perform manually, will accelerate research and explore areas of science currently inaccessible without the technologies of the fourth industrial revolution. The future lab will digitize all relevant information and train AI engines at a scale that supports the scientist as data curator and suggests new areas of investigation. Recent advances in this field show great potential but require significant amounts of labelled data to train models that are both specific and general. The scientist must therefore act as data curator and process controller, with expertise in both the domain and the technology.
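The sketch below illustrates, with entirely synthetic data, how such an iterative Design-Make-Test-Analyse loop might look when a model trained on labelled screening results proposes the next batch of experiments. The data, model choice, and selection rule are illustrative assumptions, not any specific vendor's workflow.

```python
# A minimal Design-Make-Test-Analyse loop: retrain on measured results,
# then let the model pick the next batch to test. Purely synthetic example.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# "Design" space: candidate compounds described by simple numeric descriptors.
candidates = rng.uniform(size=(500, 8))

def run_assay(batch: np.ndarray) -> np.ndarray:
    """Stand-in for the 'Make' and 'Test' steps that a real lab would run."""
    return batch @ np.linspace(1.0, 0.1, batch.shape[1]) + rng.normal(0, 0.05, len(batch))

# Start with a small random batch of labelled results.
tested_idx = list(rng.choice(len(candidates), size=20, replace=False))
activities = list(run_assay(candidates[tested_idx]))

model = RandomForestRegressor(n_estimators=200, random_state=0)
for cycle in range(5):
    # "Analyse": retrain on everything measured so far.
    model.fit(candidates[tested_idx], activities)

    # "Design": pick the untested candidates the model predicts to be best.
    untested = [i for i in range(len(candidates)) if i not in tested_idx]
    predictions = model.predict(candidates[untested])
    next_batch = [untested[i] for i in np.argsort(predictions)[-10:]]

    # "Make" and "Test": run the assay on the selected batch and add the labels.
    activities.extend(run_assay(candidates[next_batch]))
    tested_idx.extend(next_batch)
    print(f"cycle {cycle + 1}: best activity so far = {max(activities):.3f}")
```

Even this toy loop shows why the scientist stays in control: the quality of the labelled data and the choice of what to measure next remain human decisions.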

While the user presentations focused on major new investments in large-scale facilities, some asked when this might trickle down to smaller labs. The costs have been coming down, often dramatically, and some of the technology is starting to appear in related applications, all of which should aid adoption. The presentations by Fraunhofer-IPA and the SmartLab project proposed more affordable, incremental technologies that are less demanding of infrastructure. Striking a balance between adopting consumer technology in lab settings and producing technology specifically designed for science is a thorny issue still being sorted out.

The Lab Of The Future Will Need The Scientist Of The Future

Barriers to next-generation technology adoption include: reluctance to go first and risk breaking processes that work; fear of vendor lock-in; the industry’s fragmented structure; lack of training for young researchers; and a common scepticism of effort now for ‘jam tomorrow’. Demonstrations of utility and ROI are still missing, and the difficulty of sharing data, even within organizations and across established collaborations, is frustrating the good work. One interesting observation concerned the age of current decision makers, who learned their trade in a more traditional laboratory environment and thus may be hampering adoption through a fear of change. Perhaps the next generation will have different expectations, and the lab of the future will require the scientist of the future!

Regardless of the industry or academic institution, all scientists are focused on performing the best science. As one audience member stated, “We do science, not technology. We use technology to get the job done. Research is constantly evolving, almost by definition. Static processes tied to physical locations cannot keep pace, and ability to remain agile is needed.”

 

Dr. Patrick Courtney is a member of the Board of Directors of the Standards in Laboratory Automation (SiLA) consortium, an Advisory Board Member of the ADAlab, and a leading voice within euRobotics and the European Commission. He can be reached at pikcourtney@yahoo.co.uk. Kevin Teburi is managing director of Genedata UK. Prior to Genedata, he worked for many years at AstraZeneca, where he was director of Informatics and Science Information & Technology. He can be reached at kevin.teburi@genedata.com. Mario Bott from Fraunhofer-IPA and Gerhard Noelken of Pistoia Alliance also contributed to this report.