Whether you have a plan that is ready for execution or need help planning your transformation journey, Hargrove is here to help you succeed. Our Team will work with you to determine which digital technologies will have the greatest long-term value for your operation’s efficiency and profitability.
The Hargrove Controls & Automation Team has been helping industrial plants and manufacturers with production and asset optimization for decades. Beyond experience with new digital technologies, our Team's deep knowledge of both new and legacy distributed control systems (DCS), network architecture, and the production needs of the process industries gives us the insight to know which data points need to be collected and analyzed to capture the benefits and ROI of investing in digital technologies.
We help clients with both roadmap planning and digitalization implementation.
A digital twin utilizes connected smart technologies to create a digital replica of physical assets and processes. Until recently, cost-effective implementation of a digital twin was in many cases not feasible. IIoT, Big Data, and companies such as Google, which open source their AI platforms, have opened the door to practical application and implementation of digital twin technology.
The online operational digital twin is based on cumulative, real-time, real-world data measurements across an array of dimensions. These measurements create an evolving digital profile of the historical, current, and future behavior of your physical plant assets and processes. The digital twin serves as a model of asset health, forecasting and recommending action for informed decision making and avoidance of asset failures. With a digital twin, you can safely explore “what if” scenarios without putting people or assets at risk. The digital twin can also provide important insights upstream and downstream, identifying opportunities to fine-tune system performance, manufacturing processes, and maintenance execution.
Hargrove Controls & Automation understands the practical application of digital twin technology. Our Team can help you create a valuable model of your plant’s physical asset to monitor behavior and performance for optimization opportunities. Here is a video from one of our technology partners that demonstrates the value of digital twins.
Our Team works with the top digital transformation vendor platforms in the market. We are a Delivery Service Partner (DSP) with Aspen Technology, Inc., one of the leading Industrial AI software companies with products for the engineering, operations, and supply chain areas. We are partnered with Noodle.ai, an advanced cloud-based AI company focused on delivering value to industrial clients through a unique insights-as-a-service model. Our Team also has experience implementing other digitalization solutions, so regardless of your needs, we can provide the right fit for your company and facility.
Digitalization refers to the generation and use of data in a digital native form, often from multiple sources. It is a core component of the Industry 4.0 framework.
Digitization is the conversion of data into digital form, such as the scanning of paper documents or the manual entry of log sheet recordings. It differs from digitalization, which involves data that is digital from the start. Digitization unlocks value from paper records, but the conversion can be time consuming and error prone.
A digital twin is a digital model of a physical asset or process. The ultimate goal in industry would be to have a digital replica of an entire plant that represents all of the installed equipment and is capable of reproducing all of the behaviors of that plant. This would allow for experimentation and optimization of the digital twin simulation that would provide insights into possible improvements to the physical plant. In today’s reality, digital twins are more focused and specialized, with multiple digital twins replicating certain features and processes. Engineering digital twins attempt to inventory and organize all the assets in a plant and share that data among all disciplines; however, they rarely have the ability to replicate process behavior. Process digital twins are built to replicate the chemical and physical behaviors of plants to determine design details but are simplified representations of the equipment. Other digital twins may include 3D design, finite analysis, CFD (Computational Fluid Dynamics), or multi-variate predictive models for individual assets.
Industry 4.0 refers to the fourth Industrial Revolution, sometimes abbreviated 4IR. The term is commonly attributed to Klaus Schwab, founder of the World Economic Forum, around 2015. Industry 4.0 is an umbrella term for many software and hardware technologies, generally characterized by increased data connectivity between systems and the application of AI/ML, cloud computing, IoT, 5G communication, 3D printing, augmented reality, etc. Other associated terms are Smart Manufacturing, Smart Industry, Smart Factory, and Factory of the Future.
In chronological order, the generally recognized industrial revolutions were:
1st Industrial Revolution / Industry 1.0 – 1780s: Mechanization, water and steam power, interchangeable parts
2nd Industrial Revolution / Industry 2.0 – 1870s: Electrification, decoupling of factories from power source, assembly lines, motors, relays
3rd Industrial Revolution / Industry 3.0 – 1970s: Automation, electronics, computerization, semiconductors
4th Industrial Revolution / Industry 4.0 – 2010s: Artificial Intelligence, self-learning systems, bridging of physical and digital worlds
The process historian is usually at the center of any Industry 4.0 initiative and shouldn’t be taken for granted. An industrial process historian, sometimes called a PIMS (Process Information Management System), is a specialized database designed for the efficient collection and storage of time series data. General-purpose relational databases such as SQL Server and Oracle are designed to store many data formats, often in many interconnected tables. Process historians store relatively simple data formats but often use compression techniques to store that data efficiently. How a process historian collects data is also important. Process historians are designed to communicate using numerous standard and proprietary protocols to collect data from both common and obscure PLC and DCS platforms, and they often offer redundant and/or buffered communication options. It is not uncommon for process historian packages to come bundled with a more traditional database application like SQL Server to store plant and equipment information outside of process data, further demonstrating that the two types of databases are not interchangeable. Lastly, process historians nearly always come bundled with a variety of trend viewing and analysis packages. For many clients, these trend viewing packages are the face of the product and can substantially influence buying decisions.
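As a simple illustration of how historians store data efficiently, the sketch below implements a deadband (exception) filter, one common compression idea: a sample is archived only when it differs from the last archived value by more than a configured threshold. The function name, tag values, and deadband are hypothetical, not any vendor's API.

```python
# Hypothetical sketch of deadband ("exception") compression, one technique
# process historians use to reduce time-series storage. A new sample is
# archived only when it moves more than `deadband` from the last archived value.

def deadband_compress(samples, deadband):
    """samples: list of (timestamp, value) pairs; returns the archived subset."""
    stored = []
    for ts, value in samples:
        if not stored or abs(value - stored[-1][1]) > deadband:
            stored.append((ts, value))
    return stored

# Illustrative raw readings from a single tag (timestamp, value)
raw = [(0, 100.0), (1, 100.2), (2, 100.1), (3, 101.5), (4, 101.6), (5, 99.0)]
archived = deadband_compress(raw, deadband=1.0)
print(archived)  # only points that moved more than 1.0 from the last archived value
```

Real historians typically combine exception filtering at the collector with a second compression pass (such as swinging-door trending) in the archive, but the space-versus-fidelity trade-off is the same: the deadband setting determines how much detail is discarded.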
AI refers to Artificial Intelligence: mimicking the natural intelligence of humans or animals in computerized systems. ML refers to Machine Learning: mimicking in a computer the way organic brains learn from their surroundings. ML is a subset of AI, but since many AI software applications utilize ML, the terms are sometimes used in combination or interchangeably (despite the distinction). ML algorithms often use neural networks to statistically match input patterns with outcomes, either to classify outputs or to predict future events, without the complex application-specific programming of traditional If/Then logic, which requires the programmer to anticipate every scenario that may be encountered.
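The contrast between hand-written If/Then logic and learned behavior can be sketched in a few lines. All sensor values, thresholds, and training examples below are invented for illustration; the "learned" side uses a nearest-neighbor classifier, one of the simplest ML techniques.

```python
# Illustrative contrast (hypothetical data throughout): a hand-written rule
# versus a classifier that derives the same decision from labeled examples.

def rule_based(vibration, temperature):
    # Traditional If/Then logic: the programmer must anticipate every case.
    if vibration > 7.0 or temperature > 90.0:
        return "alarm"
    return "normal"

def nearest_neighbor(point, training_data):
    # Minimal ML: classify by the closest historical example (1-nearest-neighbor).
    def dist(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    return min(training_data, key=lambda row: dist(point, row[0]))[1]

# Labeled history: ((vibration, temperature), outcome)
history = [((2.0, 70.0), "normal"), ((3.0, 75.0), "normal"),
           ((8.5, 95.0), "alarm"),  ((9.0, 88.0), "alarm")]

print(rule_based(8.8, 90.0))                    # the rule fires
print(nearest_neighbor((8.8, 90.0), history))   # the same answer, learned from data
```

The learned version needs no explicit thresholds; adding more labeled history refines its behavior without reprogramming, which is the essential appeal of ML in industrial applications.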
Multi-Variate Analysis applications are a wide range of software applications that use statistical methods to analyze multiple input parameters at once. They are particularly well suited to the “big data” approaches often used in Industry 4.0 concepts, where large volumes of data are fed into applications that decide which parameters are important and which can be ignored to achieve a desired outcome. Many statistical tools fall into this category, including neural networks and Monte Carlo simulations.
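As a minimal illustration of a Monte Carlo simulation, the sketch below estimates how often a toy product property drifts out of spec given random variation in two input parameters at once. The linear process model, units, and all numbers are invented for illustration.

```python
import random

random.seed(42)

# Hypothetical Monte Carlo sketch: estimate the out-of-spec fraction of a
# product property when two inputs (temperature and flow) vary randomly.

def process_model(temp, flow):
    # Toy linear response: property rises with temperature, falls with flow.
    return 50.0 + 0.8 * (temp - 150.0) - 0.5 * (flow - 20.0)

trials = 100_000
out_of_spec = 0
for _ in range(trials):
    temp = random.gauss(150.0, 5.0)   # deg C: mean 150, std dev 5
    flow = random.gauss(20.0, 2.0)    # kg/min: mean 20, std dev 2
    if not 45.0 <= process_model(temp, flow) <= 55.0:
        out_of_spec += 1

print(f"Estimated out-of-spec fraction: {out_of_spec / trials:.3f}")
```

Because both inputs vary simultaneously, the simulation captures their combined effect on the output, something a one-variable-at-a-time sensitivity check would miss.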
Neural network models, also known as artificial neural networks (ANNs or just NNs), are a type of machine learning model that mimics the neurons and synapses of organic brains. By training an NN on historical patterns and outcomes (supervised learning), NNs can learn to match new data to trained patterns and classify them or predict future events. Several classes of industrial AI software use neural network programming, including predictive quality, predictive equipment reliability, and machine vision applications. NNs can also be trained via unsupervised learning in some cases, where the application automatically compares predicted outcomes to observed outcomes and self-corrects. Many industrial NNs are only three layers deep and a dozen or so neurons in size, far simpler than applications like ChatGPT or even an insect brain. Despite appearances, most industrial NNs do not understand the process they are examining. They are only looking at statistical correlations between input patterns and output patterns.
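A three-layer network at its smallest can be written out directly. In the sketch below the weights are hand-set to reproduce the XOR function purely for illustration; in a real application the weights would be learned from historical data during training, not written by hand.

```python
import math

# Toy three-layer network: 2 inputs -> 2 hidden neurons -> 1 output.
# The weights are hand-set for illustration so the network computes XOR;
# real networks learn their weights from training data.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hidden layer: neuron 1 approximates OR, neuron 2 approximates NAND.
HIDDEN = [([20.0, 20.0], -10.0), ([-20.0, -20.0], 30.0)]
# Output neuron approximates AND of the two hidden outputs.
OUTPUT = ([20.0, 20.0], -30.0)

def predict(x1, x2):
    h = [sigmoid(w[0] * x1 + w[1] * x2 + b) for w, b in HIDDEN]
    w, b = OUTPUT
    return sigmoid(w[0] * h[0] + w[1] * h[1] + b)

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, round(predict(a, b)))  # rounds to a XOR b
```

Even at this scale the key point from the paragraph above holds: the network has no understanding of logic, it is only a weighted statistical mapping from input patterns to output patterns.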
With Hargrove, you get the right experience from the right people in system integration working alongside you to meet and exceed your expectations. Working together as one team – that’s Hargrove.