Smart Data Management in the Race toward Effective Vaccines

January 20, 2021

Contributed Commentary by Roman Vincent

As vaccines for the novel coronavirus are starting to be rolled out across the world, all eyes are on the scientific community. Demand for novel therapeutics is higher today than ever before and will continue to be in the future. Whether you’re a contract development and manufacturing organization (CDMO) or a vaccine manufacturer, smart data strategy is critical—not just for the company, but for all the patients who may be helped by a potential therapeutic. After all, not having a strategy may block a life-saving therapy from entering the market.

Manufacturers have the increasingly important task of considering how various departments capture, manage, and store their data. Streamlining data management practices has numerous benefits, from ensuring quality and safety to keeping costs down and reducing time to market.  

I would argue that the ongoing pandemic has presented an unexpected but auspicious opportunity for companies to take a step back, re-evaluate priorities, and take advantage of the downtime to learn about new technologies. Social distancing and stay-at-home orders are still in place in many areas, reducing the number of researchers on site and productivity at the bench—this creates additional data management challenges and strengthens the impetus for modernizing.

Traditional data capture can jeopardize product safety

Pen and paper were the go-to tools for recording data for generations and worked well enough for small volumes. But with pressure to produce novel products faster and streams of data coming from multiple directions, they are plainly inadequate today. Relying on Excel spreadsheets and manual data transfer between multiple systems also leaves room for error. A Wall Street trader who reportedly typed a ‘b’ for billion instead of an ‘m’ for million triggered one of the biggest midday stock price drops in history; a blunder of that kind would have catastrophic consequences in biologics. Biopharmaceuticals are fragile—a slight variation in environmental conditions, batch materials, or equipment can render a drug wholly unsafe for public use.

Poor data management will cost you more in the long run

Tiny errors also create serious ripple effects. The well-known 1-10-100 rule outlined by George Labovitz and Yu Sang Chang highlights the amplification: it costs $1 to prevent an error at the point of entry, $10 to fix it once found, and $100 for each piece of data left uncorrected.
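The arithmetic of the 1-10-100 rule can be made concrete with a small illustrative cost model. The per-record dollar figures come straight from the rule as stated above; the batch size of 500 records is an arbitrary example, not a figure from the article.

```python
# Illustrative cost model for the 1-10-100 rule (Labovitz & Chang):
# $1 to prevent an error at entry, $10 to fix it once found,
# $100 per record left uncorrected.
COST_PER_RECORD = {"prevent": 1, "fix": 10, "uncorrected": 100}

def error_cost(n_records, stage):
    """Estimated total cost of handling n_records at the given stage."""
    return n_records * COST_PER_RECORD[stage]

# Example: a batch of 500 records, verified at entry versus never corrected.
print(error_cost(500, "prevent"))      # 500
print(error_cost(500, "uncorrected"))  # 50000
```

The point of the model is the ratio, not the absolute numbers: every stage an error survives multiplies its cost by an order of magnitude.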

Since data are used downstream by other departments, the longer an error goes undetected, the worse its consequences. Detecting and reporting an error can take two weeks; once the quality team gets involved, correcting it and preventing a recurrence can take another two months. Time, money, and market access are all at stake.

Finally, storing data in different locations can make it inaccessible: for instance, if an upstream team can’t see and share product attributes, the formulations team can’t identify adverse trends and prevent batch failures. The net cost can be massive: 15 to 25 percent of a company’s operating budget in delays and rework.

Use downtime to revamp data strategies

During the pandemic, labs may have the opportunity to carefully consider what they’d like to get from their data management systems and upgrade to suit long-term needs. Scientists working to complete experiments and analyze data need access to all their data, to form an accurate and complete picture, draw insight, and make necessary decisions.

A small team of scientists may spend 2,000 hours a year checking data, but digitizing management now can help organizations hit the ground running once work returns to normal. Shifting to digital management can also help maintain communication between those working in the lab and those working from home. But the greatest benefit will come in the long term, shortening overall timeframes and reducing workload.

Cloud software ensures integrity and compliance

Moving data management to the cloud removes many of these pain points—and data integrity is perhaps the most central. When all data are captured automatically and in a single location, the process is streamlined, human error is reduced, and data integrity is safeguarded. Scientists can easily access information when they need it, and with validation checks they can be sure the data are accurate and complete.
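The kind of validation check described above can be sketched in a few lines. This is a hypothetical illustration, not any particular platform's API: the field names (`batch_id`, `temperature_c`, `ph`) and the acceptable ranges are invented for the example.

```python
# Hypothetical sketch of an automated validation check on a captured
# batch record. Field names and limits are illustrative assumptions.
REQUIRED_FIELDS = {"batch_id", "temperature_c", "ph"}
LIMITS = {"temperature_c": (2.0, 8.0), "ph": (6.5, 7.5)}

def validate_record(record):
    """Return a list of problems; an empty list means the record passes."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys()]
    for field, (lo, hi) in LIMITS.items():
        if field in record and not lo <= record[field] <= hi:
            problems.append(f"{field} out of range: {record[field]}")
    return problems

# A record stored at room temperature fails the cold-chain check.
print(validate_record({"batch_id": "B-042", "temperature_c": 25.0, "ph": 7.0}))
```

Running checks like this at the moment of capture—rather than weeks later during review—is exactly how a single cloud platform keeps incomplete or out-of-spec records from propagating downstream.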

This is especially useful when demonstrating data integrity to regulators and meeting GxP compliance demands, since everything needed to compile a context-rich report quickly is right there. An organization spending 8,000 hours a year on searching and reporting alone can cut that to 4,000 hours after transferring to a digital platform.

Bolstering communication with teams and regulatory agencies

An end-to-end scientific informatics platform enables all teams to fully access information and make informed decisions. Building in GxP-compliant processes accounts for every variation during manufacturing and ensures quality and consistency through every stage, not just at the end.

Cloud software can also provide a faster mechanism for regulatory bodies to provide scientific advice to R&D organizations—the European Medicines Agency (EMA) is down to just 20 days. It also allows data access for the rolling reviews that enable drug candidates to be fast-tracked, something sorely needed during a pandemic. And for breakthrough approvals, the granular security model allows regulators to see and audit only the relevant data as fast as the lab produces it. Assessing data in real time is something regulators such as the EMA and the Food and Drug Administration now expect.

The bottom line: Superior accuracy and insight

There are really no downsides to automating data management: it increases speed and accuracy and enables manufacturers to adjust to data as needed. The focus shifts from capturing data to analyzing it—and drawing insight. It boosts information sharing, highlights complex trends, identifies variations before they become costly problems, and safeguards data integrity. Perhaps most importantly, data management can help biopharmaceutical manufacturers get their therapies and vaccines to market faster, which, now more than ever, is exactly what patients across the world need most.


Roman Vincent is Director of Strategy and Innovation at IDBS. With a technical background in molecular biology, Roman has more than 17 years of experience addressing market challenges in the life sciences industry. Convinced that the industry’s greatest opportunities lie in digital transformation, he joined IDBS in 2019 to help develop new out-of-the-box solutions and address the need to develop drugs faster and more safely while shortening time to market. He can be reached at