by Spencer Lamb, Kao Data
4 February 2020
Pharmaceutical, biotech and medical research markets have been quick to understand the significance of big data and fast on the uptake of artificial intelligence (AI), machine learning (ML) and deep learning (DL).
Today they leverage the technologies to discover new ways to combat illness, disease and allergies, many of which are being exacerbated by the increasing impact of global warming.
The Asthma and Allergy Foundation of America states that climate change and rising temperatures, together with the accompanying water and air pollution, will increase ground-level ozone pollution, which can cause or aggravate chronic respiratory disease, airway inflammation and damage to lung tissue.
To battle diseases such as these, many research bodies today utilize High Performance Computing (HPC) solutions, and with them, graphics processing units (GPUs), reducing processing timeframes from weeks and months to days or even hours and enabling research results to be verified all the more quickly. Yet with growing focus on emissions and sustainability, the laboratories and pharma organizations undertaking this research must begin to power their HPC workloads with renewable energy, or risk contributing to the very problem they’re trying to solve.
The bridge between now and the future
Traditionally, HPC facilities have remained on-premises, close to the research function. For the pharma industry, however, colocation data centers can also deliver a platform for cost-effective data processing, housing HPC infrastructure and enabling access to renewable energy sources. With the focus on sustainability, that access has moved straight to the top of the corporate agenda.
While genomic sequencing and data analysis hold the key to unlocking new
research, treatments and cures, it is the journey of data production that
increasing numbers of pharmaceutical, medical and data scientists are reviewing
critically, as a younger and more ecologically aware group of professionals reaches
positions of influence within their organizations.
For the next generation of decision makers, the impact of their day-to-day
choices on the environment has become a paramount consideration. It is now
more important than ever for them to know that the companies with which they
choose to form long-term, strategic relationships understand the criticality of
corporate social responsibility (CSR), of carbon emissions and Net Zero, alongside
the commercial fundamentals that corporate organizations hold close.
The media has reported that many of the world’s leading consumer and technology brands have committed to Net Zero targets, driving sustainability throughout the supply chain. However, the World Economic Forum reports that just 67 countries and eight US states currently have a net-zero ambition. Furthermore, in 2019, 43% of CDP Supply Chain program members confirmed that they are currently deselecting suppliers based on their environmental performance, or lack of it. It would therefore not be too harsh a prediction to say that companies that are not focused on sustainability will begin to lose market share and the ability to transact.
Considerations for outsourcing
Responsible use of AI technologies has undoubtedly become a key discussion point in recent times, as has the ethical use of data. Yet few have considered what is powering the high-density GPU systems delivering neural network training, or what impact that level of energy consumption has on the environment.
Data centers are undoubtedly complex buildings, consuming vast amounts of power, cooling
and connectivity. More recently, with social media and news outlets focusing on
climate change and the impact of emissions on the environment, the technology industry’s
increasing energy use has become a key focal point. Yet the exponential growth in
global demand for data can only be met through the development of advanced data
center networks. Without them, the digital services on which so many
depend would not be possible.
So how do we begin to balance the challenge of reduced emissions with demand for digital services? What is required now are energy efficient, zero carbon facilities that remove the negative impact of CO2, while providing cost-effective platforms for big data processing and high compute throughput.
Research organizations, together with pharmaceutical companies, are creating innovative
solutions to medical problems on a global scale, but do data center operators deliver
efficient platforms that provide them with resilient computational support?
At Kao Data, we believe there are seven key factors that continue to
influence the more eco-aware decision makers among us and highlight areas that
offer the bedrock of strong progressive data center customer relationships:
- Connectivity - Research organizations require
access to data sets, not only in the UK, but also on a global basis. Connectivity
is the lifeblood of a data center, so your chosen operator must provide
ultra-fast connections to scientific, public and private suppliers.
- Sustainability and the environment - With sustainability at
the top of the corporate agenda, adherence to processes and standards is
essential. Operators should demonstrate their commitment to eliminating waste
and targeting ‘zero emissions’.
- Net Zero - Many modern data centers
are improving their emissions models through use of refrigerant-free cooling and
renewable energy procurement.
- Cooling - 2019 saw the first
rise in average Power Usage Effectiveness (PUE), the ratio used to rate the power
efficiency of a data center, since the measure was introduced. Directing more power
to the GPU and less to the cooling solution can reduce energy consumption and deliver
significant savings. Technologies such as liquid cooling can be a key ally in
the quest for lower energy use.
- Power/Energy - Across the UK, power
availability is a limiting factor in the location of data centers. Guaranteed access
to energy and the capability to expand operations are essential for any forward-thinking
operator. In today’s business environment the ability to scale with an organization
can make or break a relationship.
- Location and existing customers - Research organizations will
need regular access to the data center, so location also plays a second crucial
role. If an operator is located within the UK Innovation Corridor and already works
with biotech and pharmaceutical organizations, it may well be the ideal partner.
- Design - Today, many data centers
are built to a corporate design with limited consideration of the location or
customer profiles. Hyperscale-inspired sites are designed with performance and
efficiency as key principles, and some may be Open Compute Project (OCP) certified.
Very few colocation facilities meet such a high standard or
offer the infrastructure required to support high performance computation.
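The PUE metric mentioned under Cooling is simply the ratio of a facility's total energy draw to the energy consumed by its IT equipment alone. The sketch below illustrates the calculation; the figures are illustrative assumptions, not measurements from any particular facility.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by
    IT equipment energy. 1.0 is the theoretical ideal (all power
    reaching the IT load); higher values mean more overhead for
    cooling, power distribution and other facility systems."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical example: a facility drawing 1,500 kWh overall
# to deliver 1,000 kWh to the IT equipment.
print(pue(1500, 1000))  # → 1.5
```

Lowering the cooling share of the total draw, for instance via the liquid cooling mentioned above, shrinks the numerator while the IT load stays constant, pushing PUE toward 1.0.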
Now, more than ever, biotech and pharmaceutical organizations need scalability,
access to resilient, renewable power, and high performance storage and processing. Modern data centers, of
course, provide the critical infrastructure for AI and ML applications, but not
all legacy facilities can support this requirement.
Striking a balance between sustainability and the provision of rapid
connectivity and access to energy is essential for the pharma industry. When the
requirement for 100% renewable power and a focus on sustainability is met, the relationship
between a data center operator and a research organization will flourish, with
great benefit to all.
Spencer Lamb is the Vice President of Sales & Marketing at Kao Data. Having held previous positions at Infinity SDC and Verne Global, he brings over 26 years of experience in data centers, High Performance Computing (HPC) applications, Artificial Intelligence (AI), cloud and telco to the business.