Pioneering Cloud Computing for Clinical Trials

Aug 13, 2012 | 6 Comments | Posted under: New Technologies, Trends

The cloud: A new paradigm

According to a recent IDC study, the cloud market generated $36.1 billion in 2011 and is expected to reach $72.8 billion by 2015. With a CAGR of 21% from 2011 to 2015, the cloud is growing three times faster than traditional IT infrastructure. It is now widely recognized as a revolution, yet very few people grasp the nuances and the potential behind the term. Put simply, the cloud turns computing resources into easily accessible and affordable services, unleashing unmatched potential for organizations and individuals.

Defining the Cloud

The cloud is the ability to access value-added services from anywhere, at any time, with a level of simplicity, flexibility and cost efficiency never seen before. The cloud provides on-demand access to software/applications, platforms and infrastructure, commonly known as:

– Software-as-a-Service (SaaS), such as web hosting services, collaborative tools or CRM applications.

– Platform-as-a-Service (PaaS), providing developers with the tools to build their own applications (databases, operating systems, etc.) without any costly initial IT investment in hardware. For example, Google’s App Engine is a PaaS that enables developers to create new applications.

– Infrastructure-as-a-Service (IaaS) provides access to computing infrastructure such as servers, data centers and network equipment, once again without any heavy initial investment. This type of cloud service is often used by organizations that have the IT expertise to manage their requirements but not the infrastructure itself. For example, Amazon’s Elastic Compute Cloud (EC2) provides resizable compute capacity that makes web-scale computing easier without the need for CAPEX.
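The elasticity behind "resizable compute capacity" can be illustrated with a toy sizing rule: capacity follows load instead of being purchased up front. A minimal sketch (the per-instance capacity figure is a hypothetical placeholder, not an AWS parameter):

```python
import math

# Toy illustration of elastic ("resizable") compute capacity:
# derive how many instances to run from the current load, rather
# than provisioning fixed hardware in advance.

def instances_needed(requests_per_sec: float,
                     capacity_per_instance: float = 100.0,
                     minimum: int = 1) -> int:
    """Number of instances required to absorb the current load.

    capacity_per_instance (requests/sec one instance handles) is a
    hypothetical figure; real sizing comes from benchmarking.
    """
    needed = math.ceil(requests_per_sec / capacity_per_instance)
    return max(minimum, needed)
```

Scaling from one instance at idle to three under a 250 requests/second burst, and back down again, is exactly the elasticity that removes the up-front CAPEX sizing decision.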

These three categories are called “services” in the sense that users access, subscribe to, use and monitor them on an on-demand, pay-as-you-go basis. Users can monitor the Service Level Agreement (SLA) they signed up for, submit tickets if necessary, and review their IT usage bills on their own, without any human interaction. The cloud is thus a highly automated, elastic and cost-efficient environment.
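The pay-as-you-go billing model contrasts with a fixed up-front purchase; a minimal sketch of the two cost structures (all prices are hypothetical placeholders):

```python
# Sketch of the pay-as-you-go model: the cloud bill is strictly
# proportional to usage, whereas owned hardware costs the same
# whether it is busy or idle. All prices are hypothetical.

def on_demand_cost(hours_used: float, rate_per_hour: float) -> float:
    """Cloud bill: proportional to hours actually used."""
    return hours_used * rate_per_hour

def owned_server_cost(purchase_price: float,
                      monthly_upkeep: float,
                      months: int) -> float:
    """Owned hardware: fixed purchase plus upkeep, regardless of usage."""
    return purchase_price + monthly_upkeep * months
```

A pilot study that uses 200 compute hours at a hypothetical $0.50/hour costs $100, while an owned server costs its full purchase price even if it sits idle.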

This level of autonomy is made possible by process automation: workflows are processed in the cloud without human intervention. The advantage is cost efficiency, since fewer IT hours are billed. Today, around 70% of IT budgets are spent on maintaining infrastructure, leaving only 30% for new projects. This tends to frustrate departmental managers, who see their projects queued, sometimes for years. The cloud provides an immediate and cost-effective alternative while empowering these managers.

Clouds rather than Cloud

Traditionally, the cloud is split into four deployment models: public clouds, shared by many organizations; private clouds, dedicated to a single organization; community clouds, shared by organizations with common requirements; and hybrid clouds, which combine the previous models.


Cloud technology in the life science industry 

Clinical trial professionals already use public clouds, but mostly for administrative, IT, marketing or sales purposes (such as Google Drive, document sharing systems or CRM tools); very few cloud services are directly related to life science.

Although cloud-based systems are gaining momentum in almost every industry, adoption rates for this innovative technology remain low in the life science industry. Some IT vendors in clinical trials, such as Medidata Rave, argue that they offer cloud services, whereas their services are neither self-service nor billed on a pay-as-you-go basis. This is not uncommon: many companies exploit the cloud marketing buzz yet provide services that are not self-service, automated, flexible or cost-efficient.

In clinical trials, cloud technologies are a new opportunity to lower skyrocketing costs. Electronic Data Capture (EDC) systems, Clinical Trial Management Systems (CTMS) and ePRO systems could be configured and implemented at a much faster pace and a much lower cost. In January 2012, Forbes calculated the average cost of bringing a new drug to market at $1.3 billion (at times $4 billion to $11 billion for big pharmaceutical companies); this calculation takes failed drug applications into account.

Thanks to the always-on and automated properties of the cloud, drug development costs are bound to decrease, since clinical trials will be started and completed faster than ever before.

Upcoming challenges

One of the major concerns in pioneering cloud computing for the healthcare industry is compliance. Pharmaceutical companies must ensure that the cloud service providers they use follow GCP, as guided by the 21 CFR Part 11 regulation, to ensure the system is fit for its intended use, including IP/IQ (Installation Protocol/Qualification), OQ (Operational Qualification) and PQ (Performance Qualification). Here are some tips for validating a clinical application in the cloud:

  • You must have an installation protocol to install the application into the cloud; as well as for every minor and major version upgrade.
  • In a public cloud you cannot have an installation protocol for the installation of the hardware and OS images. More and more auditors understand and accept this as a limitation of the cloud. Check with your QA department if in doubt.
  • You must provide test and production environments for each application in the cloud.
  • You must test backup and restore of all production applications.
  • It is a good idea to test your disaster recovery procedures.  You may need the cooperation of your cloud provider to simulate a disaster for you.
  • Validation of the application must take place in the cloud and you must use the same documentation and methods as if the application was running on a local server.
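The backup-and-restore tip in the list above lends itself to automation. A minimal sketch, using a checksum to verify that a restored copy is byte-identical to the original (a plain file copy stands in for a real backup system here; a validated environment would script this as part of its OQ/PQ documentation):

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256(path: Path) -> str:
    """Checksum used to prove the restored file matches the original."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup_and_restore_check(source: Path, backup_dir: Path) -> bool:
    """Back a file up, restore it, and verify byte-for-byte equality.

    The copy operations are stand-ins for a real backup/restore
    mechanism; the checksum comparison is the actual test.
    """
    backup = backup_dir / (source.name + ".bak")
    shutil.copy2(source, backup)        # "backup" step
    restored = backup_dir / source.name
    shutil.copy2(backup, restored)      # "restore" step
    return sha256(source) == sha256(restored)
```

The same pattern scales up: back up the production database, restore it into a scratch environment, and compare checksums or row counts before signing off the test.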

Since clinical trials are increasingly international, there is also a need to ensure that local regulations are followed. For example, it is essential to know where the data is hosted: some countries require clinical data to be hosted in the country where the trial is run. If a pharmaceutical company runs a clinical trial in both the US and Japan, the Japanese data must be hosted in Japan. Such regulations should be taken into consideration when implementing a global cloud-based clinical system.
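A residency rule like the Japanese example can be enforced in software before any record is written. A minimal sketch (the country-to-region table and region names are hypothetical; the real rules come from each country's regulations):

```python
# Hypothetical data-residency table: which hosting region is
# allowed to store clinical data for subjects from each country.
RESIDENCY_RULES = {
    "JP": "ap-tokyo",   # e.g. Japanese trial data must stay in Japan
    "US": "us-east",
}

def hosting_region(country: str) -> str:
    """Region where data for this country must reside; fail loudly
    if no rule has been defined rather than guessing."""
    try:
        return RESIDENCY_RULES[country]
    except KeyError:
        raise ValueError(f"No residency rule defined for {country!r}")

def can_store(country: str, region: str) -> bool:
    """Check a proposed write against the residency rule."""
    return hosting_region(country) == region
```

Refusing to store when no rule exists is the safer default here: a missing entry should trigger a regulatory review, not a silent write to an arbitrary region.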

Even though the cloud promises autonomy, flexibility and cost efficiency for pharmaceutical companies, experts are needed to ensure that the transition to cloud-based services for clinical trials is made in a safe and compliant manner. IT and life science are two very different areas of expertise, so it is critical to take the time to choose a vendor that has proved its worth in both areas and can guide you through this new technology.

Ultimately, cloud technology will revolutionize the healthcare and life science industries, enabling pharmaceutical companies to bring their drugs to patients faster and at a lower cost.

At Clinovo, we pride ourselves on seeking out the most innovative technologies and applying them to the life science industry to streamline clinical trials. Our team is composed of experts in both the IT and life science industries.

Marc Desgrousilliers, Chief Technology Officer at Clinovo

Olivier Roth, Marketing & Communication Coordinator


Subsequent to this article, Marc presented ‘Clinical Trials in the Cloud: A New Paradigm?’ at SCDM 2013, where it won second-best presentation of the entire event.

Comments

  • How is cloud computing addressing the main problem of EDCs – quality of clinical data?

  • Hello Mitch, this is an interesting question. Quality is the ultimate goal of any clinical trial, since clinical trials seek scientific evidence of the safety and efficacy of a drug or medical device. To me, the fact that the cloud involves less human intervention will limit human errors. The cloud empowers data managers and lowers the number of stakeholders in data manipulation. Cloud systems will be more modular, giving cloud-based platforms the opportunity to integrate the best modules, tested and approved by the market, on platforms that could look like app stores. This will make EDC more adaptable and flexible, and clients will be able to choose, on an on-demand basis, what modules they need, and only what they need.

  • Oliver – don’t bite on Mitch’s question. EDC’s biggest problem is not data quality.

    On your initiatives: which cloud provider are you recommending and why? I am sincerely interested in your views.

    Why do you think you need to have the cloud provider physically participate in your disaster recovery/business continuity testing? I am pretty familiar with the Microsoft Azure and Amazon Web Services (AWS) offerings and, as you know, “dropping” a physical device does nothing to availability. In fact, at any given time, in a real cloud rather than a co-hosting facility, you may not know what machine your instance is physically located on. (You can specify region, per your Japanese example, and I think the French require something similar.) But you have absolutely no control over the physical machine, nor should you want it. That is the beauty of true cloud architecture. AWS and Microsoft Azure (and the others, though many, like Dropbox, are actually built on AWS) are based on a redundant array of inexpensive devices (RAID), not only for storage but taken to the next level by providing redundancy in their computation hardware (EC2 at AWS; Microsoft Azure is itself a redundant operating system – beautiful).

    A high-availability instance that exceeds anything a single co-hosted or self-hosted installation can offer can be built at a fraction of the cost. For example: on AWS, the server stack containing the web server and programming installed on a virtual machine (AMI, or Amazon Machine Image) is already redundant and geo-located if you select that option. Then the database is redundant and geo-located if you use ‘native’ storage from Amazon. A backup copy of the server and programming could be archived in the Amazon ‘vault’ for low-cost long-term storage. You could even store every single version of your stack that you have deployed to production (all core apps plus patches, upgrades, etc.). Keep a copy at your own premises, for a warm and fuzzy feeling. Periodically grab a snapshot of the already redundant data and store it in the vault. Geo-locate your vault storage.

    Now, the only single failure that could cause an interruption of service (beyond a natural disaster, loss of power, etc.) is a virus or software bug unique to the AMI you are using. It would presumably spread to all connected instances with the same vulnerability. This can be mitigated by storing a copy of the original stack (VM or AMI), as configured for production, on Microsoft Azure. Reverse all of the above if you start with Microsoft Azure (store the image on AWS). Now, no single failure or natural disaster can interrupt service for any longer than it takes to copy the image from one cloud to the other.
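The redundancy argument in this comment can be made quantitative under a standard independence assumption: if each replica is up with probability a, then at least one of n independent replicas is up with probability 1 − (1 − a)^n. A minimal sketch (the availability figures are hypothetical):

```python
def availability(single: float, copies: int) -> float:
    """Probability that at least one of `copies` independent
    replicas is up, given each is up with probability `single`.

    Assumes failures are independent -- the assumption that a
    correlated fault (like the AMI-level bug discussed above)
    specifically violates.
    """
    return 1.0 - (1.0 - single) ** copies
```

Two independent 99%-available replicas yield roughly 99.99% availability, which is why stacking redundancy at every layer (storage, compute, region) pays off so quickly, and why correlated failures like a shared image bug are the remaining weak spot.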

    Of course, in that instance the single point of failure is the “last mile” to the client. The reliability and availability here is self-evident – move to a different access point if you must.

    Finally – I congratulate you guys on your efforts to bring the industry into the 21st century and wish you all the luck you deserve. Face it, open source, cloud and even ‘big data’ or ‘noSQL’ are concepts punishable by cyber-slamming so please stay the course.

  • Mike, thank you for your input and your great questions. For us, data quality will always be extremely important, as it is the ultimate goal of any clinical trial. To answer your question about the best cloud providers, I would need to know more about your intended use.
    I heartily agree with your comments as applied to public clouds. However, if you are looking to become a cloud service provider, I believe that building your own infrastructure may ultimately be cheaper than paying as you go for IaaS, depending on the expected load. Our research shows that the crossover point is at around 300 virtual machines running 24×7 (roughly 10 physical servers). For this reason, Clinovo has decided to build its own private/community cloud to offer EDC as a Service.
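    The rent-versus-build crossover can be sketched as a simple break-even calculation; every figure below is a hypothetical placeholder, not Clinovo's actual cost data:

```python
# Break-even between renting IaaS VMs 24x7 and running a private
# cloud. All rates are hypothetical placeholders.

HOURS_PER_MONTH = 730

def iaas_monthly(vms: int, rate_per_vm_hour: float) -> float:
    """Monthly rent for `vms` virtual machines running 24x7."""
    return vms * rate_per_vm_hour * HOURS_PER_MONTH

def private_monthly(vms: int, vms_per_server: int,
                    server_monthly: float, fixed_monthly: float) -> float:
    """Amortized monthly cost of owned servers plus fixed overhead
    (staff, facility); the fixed part is what makes building
    uneconomical at small scale."""
    servers = -(-vms // vms_per_server)  # ceiling division
    return fixed_monthly + servers * server_monthly

def cheaper_to_build(vms: int) -> bool:
    # Hypothetical: $0.10/VM-hour rented; owned cloud costs
    # $12,000/month fixed plus $1,000/month per 30-VM server.
    return private_monthly(vms, 30, 1000.0, 12000.0) < iaas_monthly(vms, 0.10)
```

    With these placeholder rates the break-even lands in the low hundreds of VMs, the same ballpark as the ~300-VM figure above; plugging in real quotes is what moves the line.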
    We recommend avoiding PaaS to store or process clinical data, because the cloud service provider, not you, controls your environment, the supported platforms, the frequency of updates and even your level of data protection. For these reasons, we chose to develop our application on our own platform in order to offer a stable, secure environment that is fully validated and compliant with Title 21 Code of Federal Regulations Part 11 (21 CFR Part 11).

    When considering SaaS (which is basically what Clinovo offers for EDC), it is the sponsor’s responsibility to audit the cloud service provider to ensure their services are 21 CFR Part 11 compliant and their IT services meet SSAE 16 (SOC 1 and 2) or SAS 70 standards, including planning for disaster recovery and testing the plan periodically.

    Your question on physical involvement is interesting. 21 CFR Part 11 mandates that your data be secure and protected, including rigorous controls on the facility hosting the physical server. I am sure you are aware that during validation, the Installation Protocol must be specific to the server, including its serial number. You will need to point to the server containing the data during an audit. Also, in order to test your disaster recovery plan, you will need to physically, or remotely through IPMI, shut down the physical server(s). Your friendly QA auditor will ask you where the data reside, and if your answer is “in the Cloud” you are likely to fail your audit.

