Mar 17

Digital Twins Are Changing The Grid

Twinning is the process of linking the physical world with a virtual counterpart, and the results are amazing.

Digital technology is getting more dynamic, and harder to follow, every day, but sometimes the most advanced technologies are simply a matter of timing. It's connecting the dots, or perhaps bridging the gap, when the ability to understand data faster meets the flexibility to act on what the technology is saying. The non-technical person may talk about stars aligning or a perfect storm of events, but that isn't the case here. It is taking the old "thinking outside the box" approach to a new level by grabbing existing applications and integrating them into a different function, and that is where digital twinning comes in.

The digital twin was introduced almost two decades ago, but some say the concept dates back even further, to the period when the first computer-aided design (CAD) systems came into the engineering department. As CAD software matured, engineers were able to develop 3D models of what they were designing. When combined with automation, those models let engineers see how their designs worked, giving them the ability to simulate devices before they were built.

With that type of tool, it wasn't long before engineers started asking, "What if we could monitor the actual equipment?" Maybe they could monitor the health of a device, identify problem areas, or improve efficiency. The potential was there, and it attracted a great deal of attention. A lot of pieces fell into place and, to keep it simple, these 3D CAD models evolved into the early digital twin concept.

Collateral Improvements

Smart technology, with its intelligent sensors and transducers, moved the theory into the real world. These devices needed to become markedly more sophisticated, substantially smaller, and much cheaper, and they did. That progress promoted the concept of interconnectivity and fed the development of sophisticated communications systems such as today's 5G technology. In this environment, Industrial Internet of Things (IIoT) technology became possible and brought about dynamic monitoring and control of industrial assets and processes.

It also helped that high-performance computing (HPC) came along and led the way to new applications like cloud infrastructure, an ideal environment for the big data these systems generate. The cloud is making data storage cheaper and more available to the entire enterprise, and it is a boost to big-data analytics and to the spread of asset simulations integrated with artificial intelligence (AI) and augmented reality. Overall, this combination of the physical world with smart technology is being called Industry 4.0, but that subject covers a flock of interesting topics that need exploration, and, like eating the proverbial elephant, digital twinning will be our first bite.

What Is a Digital Twin?

The digital twin has been described as a bridge between the real world and the virtual world, one that has produced tangible tools for heavy industry. Granted, that is a simplified summation, but it reflects how everything in the digital technology realm is interrelated in one way or another. Before moving on with the digital twinning discussion, it is important to define exactly what digital twins are. Typically a digital twin is described as a digital copy of a physical asset, but that description only scratches the surface; a digital twin is a lot more than that.

To quote GE Digital, "Digital twins are software representations of assets and processes that are used to understand, predict, and optimize performance in order to achieve improved business outcomes. Digital twins consist of three components: a data model, a set of analytics or algorithms, and knowledge."

Digital twin technology is being used by many industries, such as aerospace, defense, healthcare, transportation, manufacturing, and energy. Heck, it has even been used in Formula 1 racing for several years. More end users are coming onboard all the time, and the list of major players in the market grows every day too. It includes companies such as ABB, Accenture, Cisco, Dassault Systèmes, General Electric, IBM, Microsoft, Oracle, Schneider Electric, and Siemens, to name a few.

It is definitely a growth market, and a quick check shows some interesting figures. Depending on which study is read or which expert is quoted, the global market was worth about US$3.8 billion in 2019, and projections put it anywhere from US$35 billion to US$40 billion by 2025, with a compound annual growth rate (CAGR) in the range of 37% to 40%. No matter which figures are picked, the common denominator is that the market is growing, and it's growing at an attention-getting pace.

Growth is being driven by the benefits digital twin technology offers, such as asset management, real-time remote monitoring, real-time and predictive performance evaluation, prediction of equipment failures, and other money-saving advantages. For the grid, probably the most promising digital twin feature is improved reliability and resiliency through greater situational awareness. Being able to mine big data for actionable information has proven helpful in predicting delays and unplanned downtime. The takeaway for any business is simple: there is a digital twin in its future.

Need For Standards

That said, the power delivery industry hasn't been the quickest to deploy digital twins. Cloud-based applications like digital twinning bring challenges: selecting the correct data, validating the model, maintaining the process, and managing cybersecurity threats, to name a few. There are also some very real interoperability concerns (i.e., the digital twins from one supplier may not play well with the digital twins from another).

There are no standardized digital twin platforms, and that is a major speed bump for widespread digital twin deployment by utilities. It's not hard to imagine a utility, or several interconnected utilities, having a gaggle of digital twins that will not operate together. It is reminiscent of the early days of the smart grid, when intelligent electronic devices (IEDs) with peer-to-peer protocols were being introduced.

In those early days, IEDs offered amazing features and benefits, but only a few utilities took advantage because doing so meant sole-sourcing from one supplier, and that kept most utilities on the sidelines when it came to deployment. It didn't take long for all the stakeholders to get behind the development of vendor-agnostic interoperability standards such as IEC 61850. It was hard work, but the results speak for themselves: IEDs have developed into plug-and-play systems that are in use around the world. The same thing needs to happen in digital twinning. First, though, let's look at some examples of digital twins in use.

Digital Twin Projects

Back in 2015, GE Renewables introduced the first digital wind farm to the world. The turbines had sensors and transducers throughout their assemblies monitoring how each turbine was working. These monitoring devices sent big data to a remote operations center, where a digital twin powered by GE's Predix software provided visualizations and advanced analytics for the operators. Today GE reports it has more than 15,000 wind turbines operating in digital twin mode.

American Electric Power (AEP) recently announced it has contracted with Siemens to provide a digital twin of its transmission system. Siemens reported, "The AEP project is the largest and most complex to date, partly because AEP's presence extends from Virginia to Texas. Not only is the digital twin enhancing the utility's previous data governance strategy, the system has to be flexible enough to accommodate its continued evolution by allowing 40 AEP planners in five states access to the model and to make changes as needed, too."

Siemens also said, "AEP also wanted a system to help it automatically perform functions that up to now have been executed manually, such as assuring data compliance with the number of regulatory agencies in the eleven states it serves. The system will ensure reliability and reduce outages in a network that consists of conductors (cables) made of different physical materials spanning varying topographies and differing climates."

According to a press release from Principle Power, the Department of Energy (DOE) has awarded a US$3.6 million grant to a consortium of partners led by Principle Power, including Akselos SA, the American Bureau of Shipping, the University of California, Berkeley, and others. The funding will be used to develop, validate, and operate DIGIFLOAT, the world's first digital twin software designed for floating offshore wind farms, on the WindFloat Atlantic project.

Another recent press release announced National Grid is partnering with Utilidata and Sense to create a pilot project that is a first-of-a-kind digital twin application. It's a virtual model that will represent an "end-to-end image of their electric grid." It will be capable of mapping power flow, voltage, and infrastructure from the substation into the home. The goal is to demonstrate the value of real-time data across the grid.

Digital twinning is making inroads into the electric grid, and that isn't surprising. After all, controlling the grid is all about data and being able to act on it. To paraphrase some experts, those failing to take advantage of digital twins will be left behind.

Feb 17

Has Smart Grid Technology Impacted Utility Fatality Rates or Job Numbers?

The U.S. Bureau of Labor Statistics (BLS) recently reported there were 5250 fatal work injuries recorded in the United States in 2018, representing a 2% increase from the 5147 in 2017. Although there is no acceptable number of fatalities, the 44 deaths in 2018 in the North America Industry Classification System (NAICS) utilities sector were well below numerous other sectors. For all job classifications, homicides, roadway incidents, falls, and being struck by an object each resulted in more deaths than exposure to electricity. Is the lower fatality incident rate for utilities (2.6/100,000) compared with the construction industry (9.5/100,000) — which includes trade electricians — because of the implementation of digital and other smart grid technologies? Also, does the developing smart grid era portend fewer or more electric utility jobs?

BLS data historically identified being an electrician as one of the 10 most dangerous jobs. The data issued to date for 2018 provides a more nuanced story. Fatalities in the utilities sector, including all types of utility workers, were relatively low. Fatalities in the construction and extraction occupational sectors (see graphic) were high and within the 10 most dangerous classifications. The construction sector (NAICS 23) includes specialty trade contractors (NAICS 238), which covers electricians. While sector breakdowns do not include the number of fatalities by occupation, a straight percentage allocation of the total would indicate 160 electricians working as construction subcontractors lost their lives in 2018. To be clear, this is an estimate for comparison with the 44 utility employee deaths.
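For readers who want to follow the arithmetic, here is a minimal sketch (in Python) of the two calculations behind figures like these: the BLS convention for a fatal-injury rate per 100,000 full-time-equivalent workers, and a straight percentage allocation of a sector total to one occupation. The hours-worked and employment-share inputs below are illustrative placeholders, not published BLS values.

# Sketch of the arithmetic referenced above; input values are illustrative.

def fatal_injury_rate(fatalities, total_hours_worked):
    # BLS convention: rate per 100,000 full-time-equivalent workers, where
    # 200,000,000 hours = 100,000 workers x 40 hr/week x 50 weeks/year.
    return fatalities / total_hours_worked * 200_000_000

def allocate_by_share(sector_fatalities, occupation_employment_share):
    # Straight percentage allocation: apportion a sector's fatality total
    # to an occupation by its share of sector employment.
    return sector_fatalities * occupation_employment_share

# Examples with placeholder inputs (not BLS data):
print(fatal_injury_rate(10, 800_000_000))   # -> 2.5 deaths per 100,000 FTE
print(allocate_by_share(1_000, 0.16))       # -> 160.0 estimated deaths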

There are a host of job differences when comparing utility workers with electricians working as subcontractors, frequently on construction sites. One thought-provoking idea is that smart grid systems at utilities are affecting fatality rates by reducing or eliminating some of the most hazardous tasks. Consider advanced distribution management systems that can clear some grid faults and avoid having line workers make unnecessary trips during inclement weather. Also, look at the reduced exposure to energized cables resulting from the digitalization of substations and other grid components. The smart grid initiative also includes an increased emphasis on renewable energy and measures such as demand management. Such technologies may reduce the amount of power that must be transmitted and distributed (T&D), but they also place electrical systems in work settings where staff may not be adequately trained. However, there is no compelling evidence in the literature to date indicating that smart grid technologies are affecting the incidence rate of workplace fatalities, even though some of the arguments both ways are logical.

Most electric utility employees don't spend their time worrying about the singularity — a hypothesis concerning when technological advancement will overtake and potentially eliminate humankind. However, some may worry about technological advancements eliminating their jobs. It's clear from our experience to date that some roles are becoming less essential while others will be created or will have a higher priority. Consider the reduced need for meter readers with the adoption of advanced metering infrastructure (AMI) or the decline in customer service operators with automated response systems. Conversely, look at the increased and new roles in information services, analytics, and communications technology.

BLS data indicate utility worker jobs have declined by only 2.3% in the 10 years since the beginning of 2009. This is the same period during which we've seen huge investment in smart grid infrastructure by the electric industry. However, reviewing the data timeline below, one might argue the worker decline is more a vestige of the Great Recession, which began in 2008, than a result of the adoption of smart grid technology.


Source: BLS — Employment in the Utilities Sector

A detailed assessment released nine years ago by the Illinois Institute of Technology (IIT) and West Monroe Partners predicted more than 100 job classifications in a range of businesses and industry subsectors would be affected by the expansion of smart grid technology. The study identified gaps between existing skills and competency levels relative to those needed for the transformation of the power industry. Further, it stated that the smart grid would bring new job duties, titles, and roles to the power industry, but stopped short of finding a major workforce expansion. In fact, the study reported that workforce growth could be hampered by learning curve pains and significant age-related worker attrition occurring in the industry. Time appears to have confirmed these predictions.

The fate of utility job numbers vis-à-vis the smart grid now squarely rests with utilities themselves. It's fair to say we are past the learning curve pains and training shortages reported in the IIT study. Further, one industry assessment after another predicts high growth for the foreseeable future in smart grid technologies and their applications. The only question is whether utilities will seize the new opportunities presented by this transformation and hire the employees needed to pursue them, or allow third-party businesses to fill the void.

Oct 16

Effective Electrical Safety Comes Down to Two Factors

This technical paper on effective workplace electrical safety details the critical question that those responsible for safety must ask.

Click here for the PDF

P3 strives to bring you quality, relevant, industry-related news.

See the original article at: http://www.ecmweb.com/whitepapers/effective-electrical-safety-comes-down-two-factors?partnerref=UM_ECM_safetyTag_Oct17WP_001&utm_rid=CPG04000000918978&utm_campaign=16736&utm_medium=email&elq2=2e204a1c33634bd5a831539ab25d51f2

Feb 29

Power Quality Measurement and Analysis Basics

Mar 1, 2012 | Randy Barnett, NTT Workforce Development Institute | Electrical Construction and Maintenance

How to interpret the results of a power quality site survey

Analyzing electrical parameters associated with distributing electricity is viewed by many as complex engineering work. Yet, for engineers, electricians, and technicians troubleshooting equipment problems these days — and for contractors maintaining electrical systems they may have once installed — measuring power quality is becoming as much of a necessity as using the clamp ammeter to find out why the overloaded circuits keep tripping.

When any electrical system fails to meet its purpose, it is time to investigate the problem, find the cause, and initiate corrective action. The purpose of the electrical distribution system is to support proper operation of the loads. When a load does not operate properly, the quality of the electric power in the system should be suspected as one possible cause. Whether it’s used for troubleshooting purposes or to obtain baseline data, measuring/analyzing electrical system parameters is called power quality analysis.

The setup and use of power quality equipment — and obtaining and interpreting usable data — can be intimidating for those not familiar with the process. The key to success is knowing where and how to measure as well as how to interpret the results.

Organization and planning are key to success. Dedicating an equipment cart to hold analyzers, test equipment, drawings, manuals, a notebook, a digital camera, and safety equipment can help.

Measurement Tools

Several measurement tools are available for power quality measurement. Power quality analyzers are the most commonly used tools to observe real-time readings and also collect data for downloading to computers for analysis. While some are permanently installed in the distribution system, handheld analyzers are necessary for many applications, especially troubleshooting.

Handheld power quality analyzers are fairly lightweight (generally 4 lb to 5 lb) and will measure a variety of parameters. The most typical include voltage, amperage, frequency, dips (sags) and swells in voltage values, power factor, harmonic currents, and the resulting distortion and crest factor, power and energy, voltage and current unbalance, inrush current values, and light flicker. If an analyzer measures and records such basic parameters, you can address most power quality issues successfully.

Portable data loggers typically monitor many of the same parameters as the power quality analyzer; however, they are meant for long-term recording (days to several weeks). In addition, the data logger does not typically provide the real-time values on-screen that an analyzer can provide. Additional test equipment, such as scopemeters and recording digital multimeters, also find specific use applications.

The Process

Conducting a power quality survey begins with planning. Simply determine the purpose of the survey, and write it down in a notebook or binder that will be used throughout the process to organize and maintain data. Start with a good one-line diagram of the facility electrical distribution system. If one does not exist, then this is an excellent time to get one up to date.

If conducting a general power quality survey to obtain baseline data for future comparisons — or to help identify any immediate hidden electrical distribution problems — start monitoring as close as practical to the point of service. Beware, however: measuring near the service typically means large amounts of fault current are available. Therefore, take care to connect the analyzer at a point in the distribution system downstream of a main breaker that limits incident energy levels to acceptable values. Because power quality problems can either come from the electric utility or be generated within the facility, be sure to contact the utility to identify any possible issues on the utility side of the meter.

Inside the facility, continue to “drill down” into the distribution system following the one-line diagram. Obtain data at the source of each separately derived system. For example, take recordings at the first panelboard or switchboard after a 480V to 208Y/120V transformer. Be sure to mark up drawings, and take plenty of notes for future reference.

Digital cameras work well for quickly capturing nameplate data and later identifying exact connection locations. Note plant conditions and any equipment that was running. Print out digital pictures, and maintain all data for the survey in the notebook binder. These notes will become valuable when analyzing data and conducting further studies.

Follow manufacturer’s instructions for connecting and setting up the analyzer. Because of the amount of test equipment and supporting documentation that is needed, it is often best to have an equipment cart dedicated for power quality work. In addition to technical expertise, the underlying key to a successful survey is planning and organization. Three common mistakes when connecting power quality analyzers are:

  1. Failure to observe current polarity. Make sure the arrow on current clamps points toward the load. If the arrow points in the wrong direction, a negative current value is obtained on the analyzer for that phase.
  2. Not matching current/voltage probes. If analyzer input phase “A” is clipped onto phase “B,” it is obvious readings will be erroneous. Color code individual leads such that voltage and current leads for each phase are the same color, and connect carefully to prevent such errors.
  3. Relying on battery power to complete a lengthy monitoring session. While fully charged analyzer batteries are meant to last hours, nothing is more frustrating than to find key power quality events were not recorded because the analyzer shut down. Be sure to keep the analyzer plugged into an AC source for recording parameters when you will be away from the equipment.

Analyzing the Data

Whether observing values in real time on the analyzer's color screen or analyzing downloaded data on a laptop back in the shop, you must understand power quality parameters and their characteristics. IEEE power quality standards and NFPA 70B are excellent resources for understanding power quality terminology, issues, and corrective actions. To help with data analysis, each manufacturer provides software for its specific test equipment. Here is what to look for when analyzing data:

If you are experiencing overheating of neutrals, transformers, or motors, nuisance tripping of circuit breakers, blown fuses, or unusual audible noise in larger distribution equipment, or if distorted voltage sine waves are found, then suspect harmonics. The magnitudes of the various harmonic frequencies and the amount of total harmonic distortion they create are the critical factors in determining the severity of any harmonic problem and the correction techniques for it. Measure harmonics at their source (e.g., VFD, UPS), and expect them to lessen further upstream from the equipment. Sine wave distortion is a good indicator that you should analyze harmonic values (Figures 1, 2, and 3).
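To make the "amount of total harmonic distortion" concrete, here is a minimal sketch of the standard THD calculation in Python; the harmonic current magnitudes are made-up illustrative values, not readings from this survey.

import math

# Total harmonic distortion: RMS of the harmonic components divided by the
# fundamental, usually expressed as a percentage. Values are illustrative.
fundamental_a = 100.0                      # 60 Hz component, amperes
harmonic_a = {3: 22.0, 5: 14.0, 7: 6.0}    # magnitudes of 3rd, 5th, 7th harmonics

thd = math.sqrt(sum(i ** 2 for i in harmonic_a.values())) / fundamental_a
print(f"Current THD = {thd:.1%}")          # about 27% for these numbers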

Fig 1
Fig. 1. While performing a power quality survey in a commercial office, distortion of the current sine wave on phase “C” at a panelboard indicated nonlinear loads and potential harmonic problems.

Transients are extremely short-duration voltage surges, sometimes incorrectly called “spikes.” The voltage levels achieved during a transient can cause equipment problems ranging from malfunction to destruction. If you’re experiencing unusual insulation failures, record data for extended periods at the equipment. The most severe transients are often caused by nearby lightning strikes. However, they can also be the result of switching of loads.

Voltage sags and swells are the most common type of power quality culprits. While definitions provide specific numbers for magnitude and duration of changes up or down in voltage values, the bottom line is changes of 10% or more in either direction from normal voltage can cause problems. These conditions only need to last from ½ cycle to 1 min. Too high a voltage (swell) can occur when large loads are dropped off the line. Sags, the decrease in voltage, are typically more bothersome and can cause contactors and relays to chatter or drop out completely. Equipment such as PLCs and variable-speed drives can malfunction, and computers may lock up. Observe voltage recordings for sags and swells, and try to relate these variations to changes in plant conditions or operations, (e.g., a chiller or other large load cycling off or on).
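As a rough illustration of the 10% rule of thumb described above, the snippet below classifies an RMS voltage reading against nominal. A real analyzer also checks event duration (roughly ½ cycle to 1 minute) before calling something a sag or swell, and the 120-V nominal value here is simply an assumption.

# Classify an RMS voltage reading against nominal using the +/-10% rule of thumb.
NOMINAL_VOLTS = 120.0    # assumed nominal; substitute your system voltage

def classify_rms(v_rms, nominal=NOMINAL_VOLTS):
    deviation = (v_rms - nominal) / nominal
    if deviation <= -0.10:
        return "sag"      # 10% or more below nominal
    if deviation >= 0.10:
        return "swell"    # 10% or more above nominal
    return "normal"

print(classify_rms(104.0))   # about 13% low  -> "sag"
print(classify_rms(133.0))   # about 11% high -> "swell"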

Fig 2
Fig. 2. Switching the analyzer to the harmonic function found indications of primarily 3rd (180 Hz) and 5th (300 Hz) harmonics on phase "C." These harmonics can distort the voltage sine wave, causing misoperation of equipment and increasing heat in the neutral conductor — and in motor and transformer windings.

Voltage unbalance between phases on a 3-phase motor can cause a current unbalance of six to 10 times the voltage unbalance. Because current causes heat — and overheating is one of the leading causes of motor failure — distribution systems should be monitored for unbalance. Unbalance is often the result of single-phase loads cycling off and on, so monitor for unbalance at panelboards and switchboards throughout a typical plant cycle.
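One widely used way to quantify the unbalance discussed above is the NEMA-style definition: the maximum deviation from the average of the three line-to-line voltages, divided by that average. A short sketch with made-up readings:

# Percent voltage unbalance (maximum deviation from average / average x 100).
# The three line-to-line readings below are illustrative, not survey data.
readings = [462.0, 478.0, 470.0]

avg_v = sum(readings) / len(readings)
unbalance_pct = 100.0 * max(abs(v - avg_v) for v in readings) / avg_v
print(f"Voltage unbalance = {unbalance_pct:.1f}%")   # 1.7% for these readings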

Fig 3
Fig. 3. The concern is that the harmonic currents may severely distort the voltage sine wave causing distribution system problems. A normal crest factor (CF) should read 1.41 (Vpeak ÷ Vrms). Here, phase “C” voltage crest factor is 1.47, slightly higher than normal. The crest factor for amperage on phase “C” is 2.09.
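The crest-factor arithmetic in the caption is easy to verify: for an undistorted sine wave, peak divided by RMS works out to the square root of 2, or about 1.41. A quick check using synthetic samples (not data from Fig. 3):

import math

# Crest factor = peak value / RMS value. A pure sine wave gives sqrt(2) ~ 1.41;
# higher readings indicate a peaked, distorted waveform.
samples = [math.sin(2 * math.pi * n / 64) for n in range(64)]   # one clean cycle
rms = math.sqrt(sum(s * s for s in samples) / len(samples))
print(round(max(abs(s) for s in samples) / rms, 2))             # -> 1.41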

Success in power quality measurement and analysis comes down to three key areas. Set goals and plan the survey by reviewing one-line diagrams to determine points to monitor. Learn the functions and features of the test equipment and how to use it to capture the needed values. Finally, know what to look for when observing data, whether in the field or after it is downloaded to the computer. Learning how to successfully measure the electrical parameters associated with proper operation of equipment is a key step in solving power quality issues.

P3 strives to bring you quality, relevant, industry-related news.
See the original article at: http://ecmweb.com/power-quality/power-quality-measurement-and-analysis-basics
