White Paper

Leverage The Internet of Things (IoT) Within The Laboratory

Getting real value out of IoT can be viewed as a ‘perfect storm’ in the sense that it presents “a rare combination of circumstances with the potential to result in an event of unusual magnitude.” IoT can be defined as physical objects that are connected to a network to collect data, such as a fitness tracker recording personal activity data.

In the context of the laboratory, the direct interfacing of objects removes the need for manual data entry or transcription, which safeguards the integrity of the collected data and aids compliance with standards and regulations such as 21 CFR Part 11. Efficiency is boosted through the removal of non-value-adding activity, while more data (and more complete data) of higher quality is collected: attributable (who created a record and when), legible, contemporaneous, original and accurate (the ALCOA principles).
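As a minimal sketch of what such directly captured data might look like, the Python example below models a result record written by an interfaced instrument, where the operator, device and timestamp are captured automatically rather than transcribed. The field names and values are illustrative, not a specific vendor schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class MeasurementRecord:
    """A result written directly by an interfaced instrument (hypothetical schema)."""
    sample_id: str      # which sample was measured
    instrument_id: str  # which device produced the value (attributable)
    operator_id: str    # who was logged in at the instrument (attributable)
    value: float        # the original, untranscribed reading (original, accurate)
    unit: str
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)  # contemporaneous
    )

record = MeasurementRecord(
    sample_id="S-2024-0815",
    instrument_id="BAL-07",
    operator_id="jdoe",
    value=12.0431,
    unit="g",
)
print(record)  # legible: a plain, human-readable representation
```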

In addition to providing a greater volume of complete data, the IoT gives an overview of processes and activity. Once analytics are applied, insight can be gained into trends or the effect of implemented changes, enabling organizations to act pre-emptively. When an organization relies on disconnected, standalone equipment, however, such an overview becomes difficult to obtain, especially when devices are spread across various locations around the world. Questions then arise: Is each device correctly calibrated? What is its maintenance status? Where is this information documented? And how can the data be linked to that of the samples being tested, in order to provide evidence that the instrument was in full working order at the time of measurement?

From a cost perspective, having detailed information regarding the location of equipment and how it is being used may reveal that instruments could be relocated or used more efficiently, rather than the organization needing to incur the cost of purchasing additional equipment. However, there are two key difficulties in leveraging this data: first, the documentation often remains in the form of paper records or Excel spreadsheets; second, the equipment is disconnected, making access to the data difficult.

In the lab, connecting both equipment and systems to the network is the most obvious point to address, as a lack of integration leads to manual steps in the process (and therefore a higher likelihood of error and increased compliance risk), as well as to considerable effort to configure and maintain all the different applications.

Data capture is an important component within the drive for connectivity. Scientists should be able to enter data and observations directly into a mobile device, and to include instrument data directly through automated transfer from the connected instrument. Other methods of capturing data within the laboratory include barcode scanners, which are primarily used for automatic identification of samples and chemicals, and radio frequency identification (RFID) tags, which use electromagnetic fields to automatically identify and track objects and small electronic devices. Likewise, the work of the scientists can be supported by wearable displays that offer a quick overview of a sample or alert the user when a sample needs to be measured.
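To make the idea of automated transfer concrete, here is a hedged sketch of capturing a reading from a serial-connected balance and posting it to a LIMS over REST. The port name, message format, endpoint URL and instrument ID are assumptions for illustration; real instrument protocols and LIMS APIs vary.

```python
import serial    # pyserial: pip install pyserial
import requests

LIMS_URL = "https://lims.example.com/api/results"  # hypothetical endpoint

def capture_weight(port: str = "/dev/ttyUSB0") -> None:
    """Read one weight from a serial-connected balance and post it to the LIMS,
    removing the manual transcription step entirely."""
    with serial.Serial(port, baudrate=9600, timeout=5) as balance:
        line = balance.readline().decode("ascii").strip()  # e.g. "12.0431 g"
        value, unit = line.split()
    response = requests.post(
        LIMS_URL,
        json={"instrument": "BAL-07", "value": float(value), "unit": unit},
        timeout=10,
    )
    response.raise_for_status()  # surface transfer failures immediately

if __name__ == "__main__":
    capture_weight()
```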

Furthermore, technology such as biometric ID bracelets can boost efficiency simply by allowing the people in the lab to be recognized automatically. Entering usernames and passwords is not only a non-value-adding and time-consuming activity; it is a task that can occur up to one hundred times per day.

Another option to leverage is a heads-up display (such as Google Glass), whereby an analyst can obtain additional information about the environment (augmented reality) simply by looking at a storage cabinet, freezer or sample, or can have a procedure displayed that needs to be followed. Finally, motion-sensor technology can be applied to start processes or to record an experiment so that it can later be shared with other members of the team. By utilizing these technologies in a connected, IoT-driven way, organizations can streamline workflows and limit the amount of time diverted away from the science.

Standardization must be considered from the perspectives of the data format as well as processes and methods. Without standardization, analysts must deal with a plethora of data formats that cannot be easily shared or leveraged, making collaboration, collation, accessibility and re-use of past knowledge difficult. When data is locked away in various systems and applications that are unable to communicate with each other, access to that data becomes a barrier for further analysis, and the opportunity to take past experiences and apply that information in knowledge-driven decision making is lost.
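As a small illustration of why a standard format matters, the sketch below normalizes two invented vendor export layouts onto one common record structure, so downstream tools never need to know which instrument produced a file. The field names (sampleId, Reading, and so on) are hypothetical stand-ins for real vendor formats.

```python
import csv
import json
from pathlib import Path

def normalise(path: Path) -> dict:
    """Map a vendor-specific export onto one common record layout."""
    if path.suffix == ".json":  # hypothetical vendor A: JSON export
        raw = json.loads(path.read_text())
        return {
            "sample": raw["sampleId"],
            "value": raw["result"]["value"],
            "unit": raw["result"]["unit"],
        }
    if path.suffix == ".csv":  # hypothetical vendor B: CSV export
        with path.open(newline="") as handle:
            row = next(csv.DictReader(handle))
        return {
            "sample": row["Sample"],
            "value": float(row["Reading"]),
            "unit": row["Unit"],
        }
    raise ValueError(f"No mapping defined for {path.suffix} files")
```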

The Internet of Laboratory Things (IoLT) enables the automatic transfer of data, helps organizations to get an immediate overview of the status of equipment, samples and materials, people and activities, enables analytics, and allows real-time data sharing and collaboration with peers.

However, in order to reach the ‘perfect storm’, there are elements other than the ‘things’ that need to come into play. As discussed, standardization is a significant factor, as are data storage, the technology that needs to converge to provide the optimum environment for the IoLT, and the analytics being applied to the data. Combined, these components offer a next-generation laboratory experience.

Technology Convergence
The laboratory informatics landscape within an organization can be complex, as often there isn’t just one single Laboratory Information Management System (LIMS) but several, all for different purposes. Numerous Electronic Lab Notebooks (ELNs) may be used for different areas of application, and there could be a Laboratory Execution System (LES), Scientific Data Management System (SDMS), Chromatography Data System (CDS), Enterprise Resource Planning (ERP) system, and a variety of different inventories. It becomes obvious that this typical scenario does not support the ability to leverage the data of the IoLT in a consistent and efficient manner. Harmonization and technology convergence that eliminate the duplication and overlap of functionalities are required, and a platform approach will support this. Users and scientists can then work consistently with one single user interface to capture, access and review the data. With applications connected to a platform, it becomes easier to share that data and collaborate on projects. A convergence of technology can therefore bring capabilities traditionally spread over multiple systems into one single system, avoiding the duplication of capabilities and data, and ensuring a single version of the truth.
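One way to picture the platform approach is a single connector contract that every system or instrument implements, so that the platform ingests all data through one path. The sketch below is an assumption-laden illustration of that idea, not any particular vendor's architecture.

```python
from abc import ABC, abstractmethod
from typing import Iterable, Iterator

class InstrumentConnector(ABC):
    """One contract every connected device implements, so the platform can
    treat a balance, a pH meter and a chromatograph identically."""

    @abstractmethod
    def read_results(self) -> Iterable[dict]:
        """Yield result records in the platform's common schema."""

class BalanceConnector(InstrumentConnector):
    def read_results(self) -> Iterator[dict]:
        # A real connector would poll the device; a canned reading
        # keeps the sketch self-contained.
        yield {"instrument": "BAL-07", "value": 12.0431, "unit": "g"}

def ingest(connectors: list[InstrumentConnector]) -> list[dict]:
    """The platform's single ingestion path: one loop, one schema."""
    return [record for c in connectors for record in c.read_results()]

print(ingest([BalanceConnector()]))
```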

Big Data
Another aspect is the amount of data being created. Conversations surrounding large quantities of data invariably lead to ‘Big Data’, which means that organizations need to look not only into the volume of data created but also the variety and veracity of that data. The latter is assured by the IoLT because of the direct integration of the instruments and other items.
The large volume of data means that storage becomes a concern, especially when we consider whether the information is held with a high degree of organization, like a spreadsheet, or is unstructured, in that it has not been organized in a pre-defined manner or applied to a pre-defined data model.

To address the issue of data variety, organizations have formed pre-competitive alliances and consortia to find standard data formats, data models and ontologies. The Pistoia Alliance, for example, is a group of life science industry experts that want to address issues around aggregating, accessing and sharing data, which are central to innovation. As a body, they run projects such as Ontologies Mapping to standardize tools and methodologies, and to integrate, understand and analyze data more effectively. Another example is the Allotrope Foundation, an international association of pharmaceutical and biotech companies dedicated to building a laboratory framework to improve efficiency in data acquisition, archiving and management through a standardized approach.
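The essence of an ontology-mapping effort can be illustrated in a few lines: terms from different vendors are translated onto one shared vocabulary before data is aggregated. The mapping table below is invented for illustration and is far simpler than the curated mappings such projects actually produce.

```python
# Invented mapping of vendor-specific field names onto one shared vocabulary.
ONTOLOGY_MAP = {
    "temp_c": "temperature",              # hypothetical vendor A
    "Temperature (degC)": "temperature",  # hypothetical vendor B
    "smpl": "sample_id",
    "SampleNo": "sample_id",
}

def to_standard_terms(record: dict) -> dict:
    """Rename every known vendor field to its standard term; keep the rest."""
    return {ONTOLOGY_MAP.get(key, key): value for key, value in record.items()}

print(to_standard_terms({"SampleNo": "S-42", "Temperature (degC)": 21.5}))
```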

Data Storage
In terms of storage, many organizations still have data that exists in silos that are barely integrated and far from cohesive, as they are sometimes only accessible through proprietary vendor software and often use incompatible data formats. A modern alternative is an enterprise data lake, which stores structured and unstructured data together in a flat, object-based architecture, making it much easier to scale and manage. By using data-lake technology and storing large amounts of information in the cloud, that data can be mined and re-used across different domains in an organization, independent of the system from which it was originally produced. Adding technology to access and re-use the data will then enable users to leverage the knowledge created throughout the organization.
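As a hedged example of landing instrument output in such a lake, the sketch below uploads a raw result file to an S3-compatible object store with searchable metadata attached. The bucket name, key layout and metadata fields are assumptions for illustration.

```python
import boto3  # AWS SDK; any S3-compatible object store works the same way

s3 = boto3.client("s3")

def archive_result(bucket: str, raw_file: str, sample_id: str, instrument: str) -> None:
    """Store the raw instrument file in the lake as-is, with searchable
    metadata attached, so any domain can find and re-use it later."""
    key = f"raw/{instrument}/{sample_id}/{raw_file}"
    s3.upload_file(
        raw_file,
        bucket,
        key,
        ExtraArgs={"Metadata": {"sample-id": sample_id, "instrument": instrument}},
    )

# Example call (bucket and file names are illustrative):
# archive_result("lab-data-lake", "run_0815.csv", "S-2024-0815", "HPLC-03")
```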

Data Analytics
One final aspect to add to the IoLT is Big Data analytics. The first consideration here is context: is the data scientific in nature, from collaboration, or from production? The data needs to be contextualized through metadata. Equally important is for an organization to define what should be achieved with the analytics and which questions should be answered. This then defines the type of analytics being applied, and the method or technology for doing so: descriptive analytics (what has happened), diagnostic (why has it happened), predictive (what will happen) or prescriptive (what can we do to avoid or cause it to happen). To get the right answers to these questions, specifically in the lab environment, the system used needs to be science-aware in order to understand the data and its context.
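A toy example shows how the descriptive, predictive and prescriptive levels build on one another; the control-standard readings and specification limit below are invented for illustration.

```python
import numpy as np

# Daily control-standard readings from one instrument (invented values).
days = np.arange(10)
readings = np.array([100.0, 99.9, 99.9, 99.7, 99.8,
                     99.6, 99.5, 99.5, 99.3, 99.2])

# Descriptive: what has happened? A linear fit quantifies the drift.
slope, intercept = np.polyfit(days, readings, deg=1)
print(f"drift: {slope:.3f} units/day")

# Predictive: when will the response cross the lower specification limit?
LOWER_SPEC = 98.5  # illustrative limit
days_to_limit = (LOWER_SPEC - intercept) / slope
print(f"projected out-of-spec around day {days_to_limit:.0f}")

# Prescriptive: schedule recalibration before that projected date.
```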

Visions of environments leveraging the IoLT are not far away, and elements of them can already be experienced today. We can expect laboratory experiences where analysts are automatically identified and can execute actions through gestures; where video aids instantaneous collaboration and immediate information sharing; and where visual displays on eyewear offer information combining identification and augmented reality.

Forward-thinking organizations that want to transform the way they work in the lab with the IoLT need to establish a comprehensive strategy supported by advanced informatics tools. Partnering with providers who share this vision, and whose solutions are aligned with it, will help find the best way to leverage the IoLT. Pragmatically applying new technology can turn this vision into reality, and lead to a deep impact throughout the organization.