Re/Cappers! Pleasure and privilege for RCN to be back with you.
CES lit up the new year like potential aliens did New Jersey, so it’s a motif of this issue. While it isn’t a reality capture conference per se, many of its technologies are at minimum neighbors, at most partners.
So, when did the Consumer Electronics Show start consuming us consumers so much, boasting bots and AI that could consume us were they to consume the wrong training algorithm?
1967, that’s when! And technically, it’s not even original. Like PayPal from eBay, CES was actually a spinoff when it launched in the time of tie-dye; the Chicago Music Show was king of the tech-reveal hill until CES and its inaugural 14 exhibitors went meteoric.
‘67’s big announcement? The pocket radio! Take The Beatles with you on the go! In your pants!
Computing, reality capture roots, and other landmark tech weren’t far off. ‘78 was CES’ first rodeo in Las Vegas, which hit a royal flush with the unveiling of the home computer. ‘85 brought Nintendo, and with it, childhood 2.0.
Some fledgling operation named Apple crashed the party in 1992 and pretty much MIA’d after. The first digital satellite system blessed 1994, and robotics defined 2005, when a Roomba swept away everyone en route to winning a “Best of” Innovations Award. VR virtually blew up in 2016, not long after LiDAR was detecting a range of applications.
Those catalysts led to last week, where eye-popping displays of radiance fields, digital twins, XR, ocean-cleaning robot turtles, solid-state LiDAR, spherical drones, and other feats will themselves be catalysts. For what? Who knows, but you are dared to not be excited.
Because it will make today…look like CES #1:
Much more on CES awaits, after coverage of reality capture memorializing Jimmy Carter.
What’s Cappenin’ This Week: Jimmy Carter’s youth is preserved through reality capture, CES brings industrial its AI/robot/digital twin moment, spacecraft lose weight thanks to photogrammetry, a geospatial roundtable maps the past and future, and an AEC Error of the Week packed with modern wisdom…from Ancient Rome
Mini ‘Cappenins: NVIDIA CEO Huang takes radiance fields to the big stage, Dell enriches its AI PC portfolio, digital twins drive individualized health care, 17 weird/wonderful/terrifying robots grace CES, UAV & orthophoto data assess coastal cliff erosion, and a Hexagon acquisition
Last issue: Photogrammetry-to-XR gives an all-access pass to art museums, a capture company does Oscar-caliber jobs, BIM drives Notre-Dame’s reopening, the world gets its first 5G 360° camera, and an AEC Error of the Week featuring Kentucky Straight (Down) Bourbon Whiskey
Last October, we Re/Capped a story on the preservation of Jimmy Carter’s virtuosic woodwork. Today, we’re honored to Re/Cap the digital preservation of his Depression-era youth, and the man it manifested.
We can thank the ingenuity of a research team at The University of South Florida’s Center for Digital Heritage and Geospatial Information, in tandem with the National Park Service. They’ve created stunning virtual tours of the preeminent peanut farmer’s boyhood home, high school, and campaign headquarters, capturing intricate details of the sites and 25 furniture pieces he personally crafted.
The project forms digital twins of historic spaces with sub-millimetric precision, making these locations accessible to people worldwide. It’s a feat which not only preserves consequential physical spaces, but also tells a poignant origin story of Carter’s multifaceted life as a politician, humanitarian, woodworker, and community leader. A glimpse of the tour, within the exhaustive project overview, can be enjoyed below.
Software may have penetrated the analysis stage and, to a lesser extent, the design stage of facility management. But the actual operations have largely remained stagnant, what with manual, tedious, often wasteful workflows.
Which is why CES was more like YES for the industrial market, after NVIDIA announced a strategic partnership with Accenture and shrewd supply chainers KION Group. Its implications?
Mega. Literally.
Mega is an “Omniverse Blueprint for developing, testing and optimizing physical AI and robot fleets at scale in a digital twin before deployment into real-world facilities.” KION Group is teaming with Accenture and NVIDIA as the forerunners of Mega implementation, specifically for parcel services, consumer packaged goods, and retail. NVIDIA lays out the roadmap and tech below, including how the Omniverse Cloud Sensor RTX API refines results.
If spacecraft want to depart our atmosphere, then thrive in space, they gotta be as lean as possible, down to the ounce.
Well, a recent discovery of a thin membrane as new shuttle material is cause for shed-the-pounds excitement, but there’s a catch. Its plastic-wrappiness can wrinkle, meaning performance can suffer, meaning our beloved astronauts could be confronted with another week of freeze-dried meatloaf + ketchup.
Now, what on EARTH could POSSIBLY be used for defect detection? AH!
Engineering Professor Takashi Iwasa of Osaka Metropolitan University led a team in developing a photogrammetry-based method for measuring any wrinkle’s size, thanks to tension-field theory. Phys.org has your broad strokes on the measurement point calibration here, or you can opt for grad school with the full scientific paper published on ScienceDirect.
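To give a feel for the idea: once photogrammetry recovers 3D points across the membrane's surface, a wrinkle's size can be gauged from how far those points stray from the membrane's nominal flat height. This is only a toy sketch with invented coordinates, not the team's tension-field-theory method (which is laid out in the paper itself).

```python
# Toy sketch: estimate wrinkle amplitude from photogrammetric surface points.
# The sample points below are invented for illustration; the published method
# uses tension-field theory and calibrated measurement points.

def wrinkle_amplitude(points):
    """Peak-to-peak height deviation of membrane points from their mean level."""
    zs = [z for (_, _, z) in points]
    mean_z = sum(zs) / len(zs)
    deviations = [z - mean_z for z in zs]
    return max(deviations) - min(deviations)

# Hypothetical membrane sample: mostly flat, one ~0.4 mm wrinkle (units: mm)
points = [
    (0.0, 0.0, 0.00),
    (1.0, 0.0, 0.02),
    (2.0, 0.0, 0.40),   # wrinkle crest
    (3.0, 0.0, 0.01),
    (4.0, 0.0, -0.01),
]

print(round(wrinkle_amplitude(points), 2))  # prints 0.41
```

The peak-to-peak measure is deliberately simple; the real contribution of the research is getting trustworthy point coordinates on a near-featureless, reflective membrane in the first place.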
At least per Moore’s Law, 2014 was in fact a long time ago. The iPhone 6 was the device du jour. Vine was a goldmine. And the brand new Xbox One and PS4 were sucking up more time than that Interstellar scene in the water.
And looking back, the geospatial industry had similar hallmarks, which served as the anchors of a past-and-future-themed roundtable assembled by GIM International.
The meeting of the minds revolved around six questions, spanning how far we’ve come, where we are, and where we’re going. The distinguished experts included Linda Foster, strategic visionary at Esri; Jonathan Murphy, CEO of GoGeomatics Canada; Chris Trevillian, director of product go-to-market, geospatial, at Trimble; and Lee Hellen, CEO and founder of Kurloo Technology. Enjoy passionate chitchat on uncrewed platforms & survey instruments, hydrography, data processing, Earth observation, market demands, user interfaces, and some juicy-yet-informed predictions.
In 27 CE, amidst the reign of Emperor Tiberius, the ancient Roman town of Fidenae endured one of the most catastrophic structural failures in history.
A hastily constructed wooden amphitheater, built by a freedman named Atylius, collapsed during a gladiatorial contest, resulting in an unprecedented loss of life that to this day holds the Guinness World Record for worst sporting disaster; ancient historians Suetonius and Tacitus report that 20,000 and 50,000 people, respectively, were killed or seriously injured.
A host of factors contributed to this calamity:
Rushed construction: Atylius, eager to capitalize on the lifting of a ban on gladiatorial games, erected the amphitheater with great haste and little focus.
Poor foundations: Tacitus noted that the structure lacked solid foundations, failing to reach bedrock. What’s more, the wooden framework was poorly jointed, unable to withstand the collective weight of the spectators. The operative word here is “collective,” because, well…
Overcrowding: The event was a draw and a half, with swarms of people inside and outside the amphitheater.
The collapse was sudden and catastrophic, with eyewitness accounts describing the structure both falling inwards and collapsing outwards.
In the aftermath, Emperor Tiberius personally oversaw rescue efforts. The Roman Senate enacted new regulations to prevent future disasters, including requiring sound foundations for amphitheaters and mandatory safety inspections.
Recent scholarship has shed new light on this ancient tragedy. A digital reconstruction by Rebecca Napolitano estimates the amphitheater’s seating capacity at around 37,400, providing a clearer picture of the structure’s enormity.
At the core of the Fidenae debacle was cost-cutting negligence – and what’s more timeless than that in select humans? While the technology of ancient Rome couldn’t have prevented this tragedy, modern reality capture techniques could have been bona fide difference-makers.
Laser scanning technology could have created precise 3D point clouds of the amphitheater’s structure, allowing for detailed analysis of its geometry and potential weak points. Photogrammetry techniques could have generated high-resolution 3D models from multiple angles, capturing intricate details of the wooden framework and jointing. These could have been parlayed into a comprehensive Building Information Model, integrating structural data with material properties and load calculations.
Metrology tools could have provided accurate measurements of the foundations and overall structure, ensuring adherence to design specifications. A digital twin of the amphitheater, updated in real-time, could have simulated structural behavior under various load conditions, predicting potential failure points before they occurred. This virtual representation would have yielded informed decisions about reinforcement or redesign.
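The sort of what-if a digital twin runs can be sketched in a few lines: compare the crowd's total load against the structure's usable capacity under a safety factor. Every number below is invented purely for illustration; real structural simulation models individual members, joints, and dynamic loads, not one lump sum.

```python
# Toy version of a load check a digital twin might run before opening day.
# All capacities, masses, and member counts are invented for illustration.

def check_load(spectators, avg_mass_kg, member_capacity_n, n_members,
               safety_factor=2.0):
    """Return True if the structure carries the crowd within the safety factor."""
    g = 9.81  # gravitational acceleration, m/s^2
    total_load_n = spectators * avg_mass_kg * g
    usable_capacity_n = (member_capacity_n * n_members) / safety_factor
    return total_load_n <= usable_capacity_n

# A generously built design for ~37,400 spectators passes the check...
print(check_load(37_400, 70, 60_000, 1_000))  # True
# ...while Atylius-style overcrowding on half the framework fails it.
print(check_load(50_000, 70, 60_000, 500))    # False
```

Even this crude arithmetic flags the failure mode Tacitus described: too many spectators, too few sound members. A genuine twin would catch it long before the crowd arrived.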
The Fidenae disaster demonstrates that even in antiquity, the consequences of neglecting structural integrity could be devastating – to the point of being studied for millennia.
As we continue to build ever more ambitious structures, the lessons from this ancient catastrophe remain profoundly relevant, underscoring the critical role of meticulous planning, quality materials, and rigorous safety standards in construction – all drastically eased through reality capture.
By subscribing, you are agreeing to RCN’s Terms and Conditions of Use. To learn how RCN collects, uses, shares, and protects your personal data, please see RCN’s Privacy Policy.
Reality Capture Network • Copyright 2025 • All rights reserved