Join us for virtual technical sessions on May 18, 2022, presented in partnership with our industry partners. This year's event focuses on multimodal sensors and detection, and these sessions aim to provide attendees:
- Solutions for transportation issues and their applicability to our local environment
- Shared lessons learned
- Information about existing and new technology
- Opportunities to have technical questions answered by qualified representatives
The schedule for the event is as follows:
3:00 PM Welcome
3:10 PM Exhibitor introductions and project highlights
3:45 PM Go to breakout rooms for presentations
4:45 PM Return to main room for closing remarks
5:00 PM End of Session
Registration for this event is free for ITE members and agency members, with a nominal fee for everyone else. When you register, please select the appropriate category based on your ITE International membership and/or employer.
Register for the event through Eventbrite here.
Sessions will be held concurrently in breakout rooms. Feel free to visit one or all of our presentations!
Breakout Room 1 (Presentation)
Project: Wavetronix Intersection detection – Tampa, FL city-wide upgrade project
Presenters: Tim Janes (ATP), John Benson (Wavetronix)
Description: After an employee death while installing in-ground detection, Tampa went looking for a better, above-ground alternative. After testing multiple devices and closely observing the results, the city chose Wavetronix for its low maintenance and ability to work accurately in all weather conditions.
Breakout Room 2 (Presentation)
Project: Lakeland, FL – Intersection Collision Avoidance Safety Program (iCASP)
Presenter: Carly Randazzo
Description: In Lakeland, FL, 28% of crash deaths occurred at signalized intersections as a result of red-light running, and the city's goal was to reduce red-light-running crashes. The city installed Iteris red-light sensors at 11 intersections, along with advanced traffic signal controllers, to collect high-resolution data through video and radar detection. Within the controllers, advanced safety applications were set up to monitor vehicles and their speeds in the dilemma zone. Red-light camera coverage captured video of crashes and infractions for six months. The results showed that 48 motorists per day were still running the red light after cross traffic received a green, and some drivers shot through red lights up to 14 seconds after oncoming traffic had a green light. Red-light-running sensors were then deployed all over Lakeland, FL to help reduce red-light-running crashes by identifying vehicles likely to run the red light, then holding the all-red to avoid a collision or extending the yellow to allow the vehicle to travel through the dilemma zone.
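The hold-all-red / extend-yellow decision described above can be illustrated with a simple rule. The sketch below is a hypothetical illustration only, not Iteris's actual algorithm; it assumes the sensor reports each approaching vehicle's speed and distance to the stop bar.

```python
# Hypothetical sketch of red-light-running mitigation logic.
# Assumes the detection system reports an approaching vehicle's speed (m/s)
# and distance to the stop bar (m); the controller knows the time remaining
# until the signal turns red.

def mitigation_action(speed_mps: float, distance_m: float,
                      time_to_red_s: float,
                      speed_threshold_mps: float = 13.4) -> str:
    """Decide whether to hold the all-red or extend the yellow."""
    if speed_mps < speed_threshold_mps:
        return "none"  # vehicle is slow enough to be expected to stop
    eta_s = distance_m / speed_mps  # time until the vehicle reaches the stop bar
    if eta_s > time_to_red_s:
        # Vehicle will arrive after the light turns red: hold the all-red
        # so cross traffic is not released into its path.
        return "hold_all_red"
    # Vehicle is in the dilemma zone: extend the yellow so it can clear.
    return "extend_yellow"

# Fast vehicle 60 m out, 2.5 s until red -> arrives after the red begins.
print(mitigation_action(speed_mps=18.0, distance_m=60.0, time_to_red_s=2.5))
```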
Breakout Room 3 (Presentation)
Project: Seattle, WA (SDOT) – Mean Absolute Error (MAE) & Root Mean Squared Error (RMSE) Study: How GRIDSMART scored highest during Seattle’s test & why Seattle has deployed 70+ GRIDSMART systems
Presenters: Aron McEvoy, Nick Coffey, & Jeffrey Conor (City of Seattle)
Description: This study helped SDOT select GRIDSMART for the University of Washington MICMA project (Multimodal Integrated Corridor Mobility For All) for 40 more intersections around UW with bike and PED applications.
Problem: What above-ground detection system was the most reliable and scalable versus loops for city-wide deployment at SDOT’s 1000+ signals?
Solution: MAE & RMSE data analysis pointed to GRIDSMART. GRIDSMART is the only detection solution to use military-style Motion JPEG tracking; rather than video or radar, hundreds of high-resolution JPEG images are captured each minute and analyzed. The technology uses a continuously improving algorithm with machine learning and AI to track and detect vehicles, bikes, and pedestrians. GRIDSMART was first to market with a single-cable/single-camera design (CAT5 or fiber) and is the easiest, fastest, and most intuitive system, with a 4-8 hour setup time and only one (or two) sensors per signal. Seattle is not alone in recognizing that, as you scale across hundreds of intersections, one sensor (or two) versus four (or more) makes a significant difference in lowering total cost of ownership and ongoing maintenance costs. We continue to win the hearts of signal technicians, who love the low-cost, low-maintenance nature of one (or two) accurate and reliable sensors versus four or more. San Francisco, Los Angeles, Tacoma, and many other agencies (large and small) have selected GRIDSMART for these reasons. GRIDSMART is now deployed in 49 states and 29 countries.
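For readers unfamiliar with the two metrics named in this study, MAE and RMSE compare sensor-reported counts against ground truth. A minimal sketch, using hypothetical per-interval vehicle counts (not SDOT's data):

```python
import math

# Hypothetical per-interval vehicle counts: ground truth vs. sensor-reported.
ground_truth = [12, 8, 15, 20, 10, 18]
detected     = [11, 9, 14, 22, 10, 17]

errors = [d - g for d, g in zip(detected, ground_truth)]

# Mean Absolute Error: average magnitude of the count error.
mae = sum(abs(e) for e in errors) / len(errors)

# Root Mean Squared Error: penalizes large misses more heavily.
rmse = math.sqrt(sum(e * e for e in errors) / len(errors))

print(f"MAE = {mae:.2f}, RMSE = {rmse:.2f}")  # MAE = 1.00, RMSE = 1.15
```

Lower values on both metrics mean the detector's counts track ground truth more closely; a large gap between RMSE and MAE signals occasional big misses rather than uniform small errors.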
Breakout Room 4 (Presentation)
Project: Phoenix, AZ – Using Artificial Intelligence in Detection and Adaptive Signal Control Technology
Presenters: Justin Effinger & Tom Cooper
Description: NoTraffic implemented AI-based sensors in Arizona for NoTraffic's adaptive signal control technology (ASCT). Data collected from the fused radar/video AI-based sensors provides a more reliable dataset, allowing greater ASCT flexibility, which reduces delay and emissions and can help balance competing objectives.
Breakout Room 5 (Presentation)
Project: City of Portland, Maine – Continuous Improvement of AI
Presenter: Joel Quigley
Description: Miovision continuously samples performance at intersections across North America, with that performance tested and validated by our internal teams. When we find situations where performance is not meeting expectations, samples are captured and added to our next training iteration, improving performance when the device receives its next-generation model. This was demonstrated in work with the City of Portland, Maine, where the Miovision TrafficLink platform was installed at a large and complex intersection.
Once installed, detection accuracy was measured at 97.3%. While this wasn't an ideal score, as it did not quite meet the 98% TERL detection certification threshold, it was a good starting point. Through our continuous improvement model, this same intersection, using the exact same hardware and a simple automatic software update, improved to an accuracy score of 98.3%. With the next release, accuracy increased once again, bringing the intersection to 98.4%.
See more about the project here:
Breakout Room 6 (Presentation)
Project: City of Thousand Oaks, CA – Red Light Extension – Autoscope Vision HD Video Detection
Presenters: Jason Kohl & Vikas Manocha
Description: Utilizing an industry-first HD-3D algorithm, Econolite's multimodal sensors incorporate deep-learning AI to provide enhanced detection across all facets of the intersection. The City of Thousand Oaks, CA used Autoscope Vision's speed-threshold detection to provide an all-red extension for vehicles traveling above a user-defined speed, allowing safe passage through the intersection when necessary. After an initial pilot deployment at one intersection, the city has expanded the application to additional intersections for added safety.