Data sensors, out with the old and in with the new 

Legacy sensors still in use today are often always on, and collecting, storing and interpreting the tremendous amount of data they create can be challenging. The Internet of Things (IoT) is making it easier for that information to be collected and analyzed. The IoT is the interconnection, via the Internet, of computing devices embedded in objects so they can send and receive data.

Today’s sensors come with an analytics system that integrates event-monitoring, storage and analytics software. The system has three parts: sensors that monitor events in real time, a scalable data store and an analytics engine.
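
A minimal sketch of that three-part pattern is below: simulated sensor events, an append-only store standing in for a scalable database, and a small analytics pass. The class and field names are illustrative assumptions, not taken from any particular product.

```python
# Three-part sensor analytics pattern: monitor -> store -> analyze.
# Names and values here are illustrative only.
import random
import statistics
import time
from dataclasses import dataclass


@dataclass
class SensorEvent:
    sensor_id: str
    timestamp: float
    value: float          # e.g., a temperature or flow reading


class EventStore:
    """Stand-in for a scalable data store (in production: a time-series database)."""

    def __init__(self):
        self._events = []

    def append(self, event: SensorEvent) -> None:
        self._events.append(event)

    def query(self, sensor_id: str) -> list[SensorEvent]:
        return [e for e in self._events if e.sensor_id == sensor_id]


def monitor(sensor_id: str, store: EventStore, readings: int = 10) -> None:
    """Sensor side: emit readings as they happen and push them to the store."""
    for _ in range(readings):
        store.append(SensorEvent(sensor_id, time.time(), random.uniform(20.0, 30.0)))


def analyze(sensor_id: str, store: EventStore) -> dict:
    """Analytics-engine side: summarize what the store has collected so far."""
    values = [e.value for e in store.query(sensor_id)]
    return {"count": len(values), "mean": statistics.mean(values), "max": max(values)}


if __name__ == "__main__":
    store = EventStore()
    monitor("sensor-1", store)
    print(analyze("sensor-1", store))
```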

Sensors have improved in capability, efficiency and cost, allowing organizations to be more aware and empowered and to react intelligently to factors such as past performance metrics, configuration and calibration conditions, input-to-output rates, predicted failure intervals and environmental impact.

In several cities throughout the U.S., this technology is playing an important role in improving citizens’ quality of life, enhancing government transparency and trust, and strengthening environmental and economic sustainability. This is particularly true in cities where budgets are constrained and population growth rates continue to rise.

The city of Las Vegas is installing a traffic-monitoring system that tracks how well vehicles are moving and monitors the state of traffic signals. Sensors will be installed at 2,300 intersections and across the region’s multi-jurisdiction corridors to give the city and drivers a better perspective on traffic.

The city will be able to monitor sensors from its traffic control center, where engineers can change traffic-signal timing, check various streets and intersections and analyze trends in real time. The sensors are also equipped to communicate with autonomous cars, which will have access to real-time traffic light data so they know when to stop or slow down. The system can also tell cars and drivers the best speed along a stretch of road so they can proceed through the maximum number of green lights.
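
The "best speed" idea can be illustrated with a simplified, hypothetical calculation: given the distance to each upcoming signal and the window in which it will be green, count how many signals a car traveling at a constant speed would catch on green. Real systems work from live signal-timing data; the corridor below is made up.

```python
# Hypothetical "green wave" speed picker. Distances, green windows and
# candidate speeds are invented for illustration.

def greens_caught(speed_mph: float, signals: list[tuple[float, float, float]]) -> int:
    """signals: (distance_miles, green_start_s, green_end_s) measured from the car's position."""
    count = 0
    for distance, green_start, green_end in signals:
        arrival = distance / speed_mph * 3600.0   # travel time in seconds
        if green_start <= arrival <= green_end:
            count += 1
    return count


def best_speed(signals, speeds=range(25, 46, 5)):
    """Pick the candidate speed (mph) that passes the most signals on green."""
    return max(speeds, key=lambda s: greens_caught(s, signals))


if __name__ == "__main__":
    corridor = [(0.25, 25, 35), (0.50, 55, 85), (0.75, 90, 120)]
    print(best_speed(corridor))  # prints 30: it catches all three made-up signals on green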

While Las Vegas works to ease traffic, Chicago is measuring its rainwater through a new pilot project that combines sensors and cloud computing. Sensors are already in place at three locations to measure rainwater running downhill.

The tool is aimed at reducing urban flooding and preventing millions of dollars in subsequent property damage. The sensors can record, among other things, precipitation amounts, humidity levels, soil moisture measurements, air pressure levels and chemical absorption rates. Planners and engineers in Chicago hope to collect data that will help them build and manage green infrastructure.
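
To make the use of such readings concrete, here is a hypothetical sketch of how measurements like those listed above might be rolled up per site to flag likely runoff problems. The field names and thresholds are assumptions for illustration, not Chicago’s actual criteria.

```python
# Flag sites where heavy rain is falling on already-saturated soil.
# Field names and thresholds are illustrative only.
from dataclasses import dataclass


@dataclass
class SiteReading:
    site: str
    precipitation_in: float   # rainfall since the last reading, in inches
    soil_moisture_pct: float  # volumetric soil moisture, in percent


def flag_runoff_risk(readings: list[SiteReading],
                     rain_threshold_in: float = 1.0,
                     moisture_threshold_pct: float = 40.0) -> list[str]:
    """Return the sites whose rain total and latest soil moisture both exceed thresholds."""
    totals: dict[str, float] = {}
    latest_moisture: dict[str, float] = {}
    for r in readings:
        totals[r.site] = totals.get(r.site, 0.0) + r.precipitation_in
        latest_moisture[r.site] = r.soil_moisture_pct
    return [site for site in totals
            if totals[site] >= rain_threshold_in
            and latest_moisture[site] >= moisture_threshold_pct]


if __name__ == "__main__":
    sample = [
        SiteReading("site-A", 0.6, 45.0),
        SiteReading("site-A", 0.5, 48.0),
        SiteReading("site-B", 0.2, 20.0),
    ]
    print(flag_runoff_risk(sample))  # ['site-A']
```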

Texas is also keeping track of its water levels to better manage the flood plain along the Colorado River basin. The Lower Colorado River Authority built a network of 275 connected river sensors, called Hydromet. The sensors provide near-real-time data on stream flow, river stage, rainfall totals, temperature and humidity.

In July, LCRA received a $650,000 contract from the U.S. Department of Homeland Security to investigate better sensor technologies and the software needed to relay information and alerts during a flood. The goal is to find reasonably priced, high-tech sensors rugged enough to last in outdoor conditions.
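
As a rough illustration of the relay step, the sketch below compares near-real-time gauge readings (river stage, in this toy example) against a per-gauge flood-stage threshold and emits alert messages. The gauge names, stages and thresholds are invented; Hydromet’s actual data formats and alerting pipeline may differ.

```python
# Turn latest river-stage readings into simple flood alerts.
# All names and numbers below are hypothetical.

def flood_alerts(readings: dict[str, float], flood_stage_ft: dict[str, float]) -> list[str]:
    """readings: latest river stage (feet) per gauge; flood_stage_ft: alert threshold per gauge."""
    alerts = []
    for gauge, stage in readings.items():
        threshold = flood_stage_ft.get(gauge)
        if threshold is not None and stage >= threshold:
            alerts.append(f"FLOOD ALERT: {gauge} at {stage:.1f} ft (flood stage {threshold:.1f} ft)")
    return alerts


if __name__ == "__main__":
    latest = {"gauge-onion-creek": 23.4, "gauge-barton-creek": 4.1}
    thresholds = {"gauge-onion-creek": 22.0, "gauge-barton-creek": 8.0}
    for msg in flood_alerts(latest, thresholds):
        print(msg)
```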

Another goal for LCRA is sensors that could help emergency responders geo-target the smartphones of Texans living in areas where flooding is likely to occur.

