A look at how video and computer vision can improve worker health and safety.
Author: Chris Penrose
Employees throughout the manufacturing industry filed hundreds of worker safety lawsuits in 2020. In fact, 68% of workers globally reported they do not feel completely safe at work, citing inadequate personal protective equipment (PPE) and employer noncompliance with sanitation protocols, mask-wearing and temperature checks. OSHA, in turn, issued $3.8 million in penalties for COVID-19-related citations.
To keep workers safe and reduce penalties for noncompliance, organizations must adopt innovative technologies, such as video sensors and advanced analytics, to automate employee health and safety monitoring. Automation frees organizations from relying on inefficient manual processes and posted reminders, letting them enforce mandates and protect employees at all times.
The standout among these new technologies is artificial intelligence (AI) enabled real-time video analytics, which allows organizations to automate health and safety monitoring. This capability will become mission-critical as video data volumes grow and more employees return to in-person work amid the COVID-19 pandemic.
With digital transformation initiatives expected to consume 53% of IT budgets in the coming years, this capability will help organizations avoid expensive cloud data processing costs. By eliminating the need to send these ever-increasing amounts of Internet of Things (IoT) data to the cloud for processing, video-based AI technology can quickly and accurately produce actionable insights in real time for both cost savings and increased operational efficiency, ultimately increasing overall workplace safety.
Here are a few use cases demonstrating how video and computer vision can improve health and safety monitoring.
Before COVID-19, employees in industrial settings were already required to wear PPE, including hard hats, vests, gloves, eyewear and respirators. Whether the neglect lies with the employee or the organization, a lack of PPE usage is often flagged only after an injury has occurred, creating a liability for industrial organizations. Video monitoring combined with real-time analytics can thus reduce workplace injuries and increase productivity.
By ingesting live video streams from standard RGB cameras, computer vision technology can detect a lack of or improper PPE usage. Upon recognizing a safety violation, the video analytics platform can then generate email or SMS text alerts to notify safety supervisors. Live updates can also be published to dashboards at points of entry (e.g., the entrance to the manufacturing floor) to prevent a potentially dangerous situation, such as a worker attempting to enter a hard hat-designated area of the factory without one.
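The detection-to-alert flow described above can be sketched in a few lines of Python. The detector below is a stub standing in for a real deep learning model fed by live RGB camera streams, and the required-PPE set, worker IDs and alert format are illustrative assumptions rather than any specific product's API.

```python
# Minimal sketch of a PPE-compliance check and supervisor alert.
# detect_ppe() is a stub for a computer vision model; a real system
# would run inference on each incoming video frame here.

REQUIRED_PPE = {"hard_hat", "vest", "gloves"}  # example site policy

def detect_ppe(frame):
    """Stub detector: maps worker IDs to the PPE items seen on them."""
    return frame["detections"]

def find_violations(frame):
    """Return (worker_id, missing_items) for each non-compliant worker."""
    violations = []
    for worker_id, items in detect_ppe(frame).items():
        missing = REQUIRED_PPE - set(items)
        if missing:
            violations.append((worker_id, sorted(missing)))
    return violations

def format_alert(violations, zone):
    """Build the email/SMS text a safety supervisor would receive."""
    lines = [f"PPE violation in {zone}:"]
    for worker_id, missing in violations:
        lines.append(f"  worker {worker_id} missing: {', '.join(missing)}")
    return "\n".join(lines)

frame = {"detections": {
    "W-101": ["hard_hat", "vest", "gloves"],
    "W-102": ["vest"],
}}
violations = find_violations(frame)
if violations:
    print(format_alert(violations, "Assembly Line 3"))
```

The same violation list could just as easily be published to a dashboard at a point of entry instead of, or in addition to, the alert message.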
Workplace hazards can take many forms, such as moving cranes, falling debris, trip hazards, slippery surfaces and gas leaks. AI-enabled video analytics and video-based sensors can be customized to detect any of these hazards before they result in workplace injury, significantly reducing an organization's liabilities. For example, RGB cameras can detect a liquid spill before a worker slips and falls.
Similar to PPE detection, live video streams from standard RGB cameras can be paired with custom-designed deep learning models for organizations’ respective workplace hazards. Furthermore, the corresponding analytics platform can generate alerts via email or SMS text when a workplace hazard is identified. The platform can also publish live updates to points of entry, management consoles and other locations to notify safety supervisors in real time. This allows for timely action on a supervisor’s behalf to get the workplace hazard under control before it becomes a liability, such as temporarily closing down an area to clean up a spill.
Industrial organizations must ensure workers' health as well as their physical safety. The real-time power of video AI technology can also help mitigate the spread of infectious diseases within the workplace, such as COVID-19, the flu or the common cold. By pairing live video streams from thermal and RGB cameras with deep learning models, analytics platforms can be programmed to detect elevated body temperatures (i.e., a fever), social distancing violations, and missing or improperly worn facial coverings or respirators.
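As one concrete example, a social-distancing check reduces to a pairwise distance test once detected workers have been mapped to floor coordinates. The sketch below assumes that mapping has already happened upstream; the worker IDs, coordinates and 2-meter threshold are illustrative, not drawn from a specific regulation or product.

```python
# Hedged sketch of a social-distancing check on worker positions
# (floor coordinates in meters, assumed to come from an upstream
# person-detection and camera-calibration step).
from itertools import combinations
from math import dist

MIN_DISTANCE_M = 2.0  # example distancing rule

def distancing_violations(positions):
    """Return worker-ID pairs standing closer than the minimum distance."""
    pairs = []
    for (id_a, pos_a), (id_b, pos_b) in combinations(positions.items(), 2):
        if dist(pos_a, pos_b) < MIN_DISTANCE_M:
            pairs.append((id_a, id_b))
    return pairs

positions = {"W-1": (0.0, 0.0), "W-2": (1.2, 0.5), "W-3": (6.0, 6.0)}
print(distancing_violations(positions))  # only W-1 and W-2 are too close
```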
Similar to PPE and hazard monitoring, video AI-powered health monitoring systems can be customized to reflect an organization's specific health regulations. Organizations can also integrate video AI health monitoring with their existing access control and security systems, such as automatic doors and gates, to further limit the spread of illness in the workplace. For example, if a factory floor worker tries to enter the building with a body temperature above 100 degrees Fahrenheit, thermal cameras can detect the elevated temperature, and the connected access control system keeps the automatic doors or gates leading to the factory floor closed, preventing the employee from entering.
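The thermal-screening-to-gate handoff described above amounts to a simple threshold decision. In this sketch, the `Gate` class and the 100°F threshold are assumptions for illustration, standing in for whatever controller interface a real access control vendor exposes.

```python
# Illustrative sketch of tying a thermal-camera temperature reading
# to an access control decision. The gate interface and fever
# threshold are example assumptions, not a specific vendor API.

FEVER_THRESHOLD_F = 100.0

class Gate:
    """Stand-in for an automatic door/gate controller."""
    def __init__(self):
        self.open = False

    def admit(self):
        self.open = True

    def deny(self):
        self.open = False

def screen_entry(temperature_f, gate):
    """Open the gate only if the reading is below the fever threshold."""
    if temperature_f >= FEVER_THRESHOLD_F:
        gate.deny()
        return "denied: elevated temperature"
    gate.admit()
    return "admitted"

gate = Gate()
print(screen_entry(98.6, gate))   # admitted
print(screen_entry(101.3, gate))  # denied: elevated temperature
```

A production system would add steps this sketch omits, such as confirming the reading over several frames and routing denials to a supervisor for follow-up screening.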
Beyond Health and Safety Monitoring
Video AI capabilities are powered by edge computing. Edge computing processes data at the edge, the site where it is produced, deriving insights in real time rather than sending data to the cloud for processing and back to the edge to act on the results. When paired with AI, the value of edge computing for manufacturing goes beyond health and safety monitoring. Real-time insights can also be leveraged for other areas of manufacturing, such as predictive maintenance and automated product defect detection.
However, the benefits of edge AI are not exclusive to the industrial sector. For instance, organizations in the energy industry can use real-time actionable insights to optimize energy usage for sustainability initiatives. Other use cases include integrating edge analytics and video sensors into automotive security systems for driver facial recognition, and oil and gas field workers combining edge AI with video sensors to monitor for dangerous gas leaks and adhere to industry emission regulations.
As efficiency and health and safety monitoring become more vital to the operations of every industry, edge computing and video AI will continue to empower organizations through valuable, actionable insights from streaming data.
Chris Penrose is COO of FogHorn, a developer of edge artificial intelligence software for industrial and commercial Internet of Things applications. Penrose has more than 30 years of experience in telecommunications and software. He was previously senior vice president of portfolio integration and partner services at AT&T Business Solutions.