DESKTOP ENGINEERING WITH IBM:
High Performance Cluster Computing Survey
An executive summary of the results of a survey conducted by Desktop Engineering to gauge its audience’s familiarity with high performance cluster computing and its benefits.
Target Analytics Maturity to Beat the Big Data Backlash
In recent years, the media has been full of Big Data, touting either its amazing potential or its privacy implications. With multiple vendors desperate to establish primacy, Big Data has been everywhere.
However, outside of established public-facing poster boys like Google, Amazon and Facebook, tangible progress has been patchy. It has all been a bit too hard, fighting against immature technology and a scarcity of available expertise. Quick wins have been few and far between. Reality has not lived up to expectation, and Big Data is now sliding into the trough of disillusionment on the technology hype cycle, experiencing a backlash.
Why Monitoring Is More Than Just SDV
A successful risk-based monitoring program entails more than just better site monitoring and reduced source document verification (SDV); it also requires building quality into the protocol from the start and redefining what data quality means. This paper explains how Medidata’s cloud-based tools support industry-pioneering monitoring programs in line with TransCelerate’s five recommended tactics: building Quality by Design (QbD) into trials, assessing risk early and on an ongoing basis, focusing on critical processes and critical data, using risk indicators and thresholds, and adjusting monitoring activities accordingly.
Taking NAS to the Cloud
Leveraging Cloud Storage for Data Deluge in the Sciences
The data deluge facing the life sciences is causing many to look at new options for data storage. This unprecedented rate of growth is straining IT budgets and has many looking to cloud-based services for answers. This paper examines data center technologies designed to integrate cost-effective cloud storage options easily into an existing network-attached storage system. Weighing capacity, performance, and total cost of ownership, the author makes the case that cloud integration is a viable, even preferable, alternative to keeping pace with a traditional storage system approach.
Running Smarter Trials with Data-Driven Monitoring
Clinical monitoring remains one of the most important and most costly activities in the clinical research paradigm. Unlike many clinical trial activities, which have been steadily transformed by technology, the monitoring function itself has changed little.
This white paper examines the core components, prospective benefits, and broad principles for the adoption of a data-driven monitoring solution, and shows how it can substantially improve study quality and safety while reducing monitoring costs, which today account for 15-30% of total study costs.
The Fourth Paradigm of Science
In the Information era, everything about science is changing. Experimental, theoretical and computational sciences are all being affected by the data deluge, and a fourth, “data-intensive” science paradigm is emerging. We call this fourth paradigm of science Reverse Informatics: the process of recovering the “source data” from the primary scientific literature. In this whitepaper we discuss the application of Reverse Informatics in scientific research. Parthys Reverse Informatics is one of the leading information research organizations, supplying solutions for all aspects of drug discovery informatics, including cancer informatics, neuroinformatics, cheminformatics, pharmacoinformatics and translational informatics. Reverse Informatics’ three services are Literature Curation, Patent Analytics and Thought/Opinion Leader research.
Recent whitepapers include:
To promote your whitepaper with Bio-IT World or Clinical Informatics News, please contact: