Last year, Transport for London tested AI-powered CCTV at Willesden Green tube station, running feeds through automated systems from October 2022 to September 2023. According to Wired, the goal was to detect fare evasion, aggressive gestures, and safety risks. Instead, the system generated more than 44,000 alerts, nearly half of them false or misdirected. Children following parents through ticket barriers triggered fare-dodging alarms, and the algorithms struggled to distinguish folding bikes from standard ones. The impact was immediate: staff faced 19,000-plus real-time alerts requiring manual review, not because that many problems existed, but because the AI could not distinguish between appearance and intent. Trained to watch motion and posture, not context, the system exposed a deeper flaw at the core of many AI tools today. As AI spreads into daily life, from shops to airports, its inability to interpret why we move, rather than simply how, risks turning ordinary human behavior into false alarms.

The Limits of What Cameras Can See

Most vision AI excels at spotting patterns: crossing a line, entering a zone, breaking routine. But nuance, ambiguity, and cultural variation trip these systems up. "In dynamic or crowded environments, one of the biggest challenges is when people or objects block each other from view," says Tuan Le Anh, CEO of Vietnam-based Advanced Technology Innovations (ATIN). "When people overlap or move quickly in low lighting, the system might merge them into one person or, worse, duplicate them. It's easy for cameras to miss key actions or misclassify what's going on entirely."

That lack of context has real consequences. A person running could be exercising, fleeing danger, or chasing a bus, but AI sees only the act, not the reason. Most systems process brief visual fragments without factoring in time, crowd dynamics, or audio. "They can say what is happening, like someone running, but not why," Le Anh notes. "That lack of causal reasoning creates blind spots."
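A toy sketch makes the blind spot concrete. This is purely illustrative, not TfL's or any vendor's actual logic: the function name, the labels, and the 3.0 m/s cutoff are all hypothetical. A detector keyed only to observed motion returns the same verdict for every intent.

```python
# Purely illustrative: a toy motion "classifier" that, like the systems
# described above, sees only the act. The threshold and labels are
# hypothetical, not any real system's logic.

def classify_motion(speed_m_per_s: float) -> str:
    """Label movement from observed speed alone, with no context."""
    if speed_m_per_s > 3.0:  # arbitrary cutoff for "running"
        return "running: ALERT"
    return "walking: ok"

# Three very different intents, one indistinguishable observation:
print(classify_motion(4.5))  # fleeing danger    -> running: ALERT
print(classify_motion(4.5))  # chasing a bus     -> running: ALERT
print(classify_motion(4.5))  # out for a jog     -> running: ALERT
```

Intent lives outside the pixels: separating those three cases would require extra signals such as time of day, crowd flow, or audio, which is exactly the multi-source context Le Anh describes.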
In practice, this has led to retail cameras mistaking reaching motions for theft, public transit systems disproportionately flagging passengers of color, and healthcare monitors confusing routine gestures with signs of distress, sometimes while missing genuine emergencies. Le Anh argues the solution lies in training AI to see the whole scene. "When you combine multiple data sources and let the model learn from patterns over time, you get closer to something that understands intent," he says. "That's where this technology can stop making the same mistakes and start becoming truly useful."

False Patterns, Real Consequences

This problem reflects what Sagi Ben Moshe, CEO of Lumana, calls the pattern-matching trap. AI trained to classify pixels often latches on to surface details with no real meaning. "One classic example came from military image-recognition projects," Ben Moshe tells Fast Company. "They trained the system to detect tanks using photos that happened to be taken near trees. What happened is that the system learned to spot trees, not tanks. It worked great in testing, but failed in the field."

Lumana, fresh off a $40 million Series A funding round led by Wing Venture Capital and backed by Norwest and S Capital, designs video AI to avoid those pitfalls. Its continuous learning models track motion over time and in context. "There's a huge difference between seeing and understanding," Ben Moshe says. "AI currently can detect a person, but it doesn't know if that person is distressed, distracted, or just waiting for a ride. And when systems act on that incomplete view, we risk misunderstanding becoming automated at scale." The risks are highest in schools, hospitals, and stadiums, places where safety depends on accurate classification and false positives can cause escalation or missed threats. Lumana's approach integrates diverse data streams to reduce those errors.

Why AI Needs Physics, Not Just Pixels

Experts argue that real understanding requires more than 2D vision. AI must learn the same physical and spatial rules humans absorb as children: gravity, motion, cause, and effect. "Today's AI vision systems are amazing at spotting patterns, but terrible at explaining why something is happening," Ben Moshe says. "They don't have a built-in sense of physical logic. A toddler knows that if you push a ball, it rolls. An AI model doesn't, unless it's seen millions of videos of balls rolling in similar ways."

Industry efforts are moving in that direction. Lumana builds structured models of objects, forces, and scenes, while ATIN explores transformer-based vision and 3D scene graphs to capture depth and relational context. But high-resolution, real-time interpretation demands vast processing power. As Ben Moshe puts it, "Not everyone can have an Nvidia H200 sitting in their building."

Building AI That Understands

As companies race to automate physical spaces, the stakes are clear: unless AI learns context, we risk scaling human blind spots into automated ones. "When you deploy AI that sees without understanding, you create systems that act with confidence but without context," Ben Moshe says. "That's a recipe for unfairness, distrust, and failure, especially in high-stakes environments."

Ben Moshe and Le Anh agree: the future of AI won't hinge on sharper cameras or better labels, but on reasoning that links movement to meaning and time to intent. If AI is to coexist with humans, it must first understand us and our world. Progress is happening, with models that integrate time, audio, and environmental cues. But real trust will depend on systems that are not only smarter but also transparent, interpretable, and aligned with human complexity. When that shift comes, AI won't just recognize a face or track motion; it will grasp the context behind it. And that opens the door to technology that doesn't just watch us, but works with us to create safer, fairer, more responsive public spaces.
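The 3D scene graphs mentioned above can be pictured with a minimal sketch. Everything here is a hypothetical illustration of the data structure, not ATIN's or Lumana's actual schema: detected objects become nodes, and spatial or causal relations become edges that a reasoning layer can query for intent.

```python
# Illustrative only: a stripped-down scene graph. Class names, relations,
# and coordinates are hypothetical, not any vendor's real schema.
from dataclasses import dataclass, field

@dataclass
class SceneObject:
    label: str
    position: tuple  # (x, y, z) in metres, camera-relative

@dataclass
class SceneGraph:
    objects: dict = field(default_factory=dict)
    relations: list = field(default_factory=list)  # (subject, predicate, object)

    def add(self, name: str, obj: SceneObject) -> None:
        self.objects[name] = obj

    def relate(self, subj: str, predicate: str, obj: str) -> None:
        self.relations.append((subj, predicate, obj))

# A running person plus a relation to a nearby bus lets a reasoning layer
# infer "catching the bus" rather than "fleeing danger":
graph = SceneGraph()
graph.add("person_1", SceneObject("person", (2.0, 0.0, 5.0)))
graph.add("bus_1", SceneObject("bus", (10.0, 0.0, 6.0)))
graph.relate("person_1", "moving_toward", "bus_1")

print(graph.relations)  # [('person_1', 'moving_toward', 'bus_1')]
```

The design point is that the relation triples, not the raw pixels, carry the context a downstream model would reason over.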
Category:
E-Commerce
When the world stops making sense and everyone’s looking to you for answers, that’s when real leadership begins. I learned this in the most extreme of circumstances: first, as a SWAT team Tactical Commander, where split-second decisions meant life or death, then as CEO of a major public company, where market crises could make or break thousands of our customers’ livelihoods. The skills that kept our team alive in tactical operations are the same ones that helped steer our organization through economic downturns, industry disruptions, and unprecedented challenges. Crisis leadership isn’t about having all the answers; it’s about having the right framework to make decisive moves when the stakes are highest.

The OODA Loop: Your Crisis Leadership Framework

Air Force Colonel John Boyd developed the OODA Loop by studying why American F-86 fighter pilots dominated technically superior MiG fighters in the Korean War. The framework he created became the gold standard for decision-making in competitive, high-stakes environments, and remains standard methodology in tactical operations today. The loop takes you through four key steps:

Observe: Rapidly gather information about the evolving situation
Orient: Process observations against your experience and current reality
Decide: Choose your course of action with incomplete information
Act: Execute decisively while preparing for the next cycle

Speeding through the cycle becomes your ultimate competitive advantage. The leader who can take these steps faster than the crisis evolves wins the game, or, in some cases, the dogfight.

Observe Without Panic

In SWAT operations, observation meant survival. You had to see everything (suspect behavior, environmental hazards, team positioning) while filtering out distractions under extreme stress. The same discipline applies in business crises.
During business challenges, other industry leaders typically make reactive decisions based on incomplete observations, or worse, fail to make decisions due to analysis paralysis. Instead, I apply Boyd’s observation principles: gather data systematically, look for patterns that others miss, and resist the urge to act before you truly understand what’s happening. Here’s the landmine: The moment you let emotions cloud your observation, you lose your competitive advantage.

Orient Faster Than Your Competition

Boyd believed orientation was the most critical phase, where you synthesize observations with experience and strategic context. In tactical operations, poor orientation gets people killed. In business, it gets companies killed. During the COVID pandemic, while I was CEO of RE/MAX, competitors were still trying to understand the paralyzed market. We had to orient immediately to the new reality of Zoom showings, curbside closings, and overall new ways of doing business. To do that, I drew on my law enforcement experience of reading situations that didn’t match expectations. We recognized that the market continued even with different protocols, and we had to adjust how we did business. This rapid orientation gave us a significant market advantage, even in the hardest-hit areas. Companies that survive crises orient to new realities fastest and most accurately.

Decide with Tactical Precision

Boyd understood that having perfect information is a luxury you can’t afford. Whether breaching a door or entering a volatile market, you decide with 70% information and 100% commitment. When you have multiple options but incomplete data, apply the same process we used in high-risk operations: identify your primary objective, consider second-order effects, choose the option that advances your mission, then commit fully. The decision-making process builds confidence to take action. Waiting isn’t decision-making, and waiting for nonexistent information is a fool’s game.
How would you feel if the SWAT team had no next step while you were the hostage in the building? The same is true in business: gather just enough information to lean toward a decision. In a crisis, the worst decision is usually no decision.

Act While Others Hesitate

Action without observation and orientation is reckless, but observation without action is worthless. Boyd’s framework only works when you complete the cycle, then immediately start the next one. During market shifts, our actions required balancing financial and operational oversight against financial challenges and legal restrictions. Crisis operation isn’t just about moving fast; it’s moving with purpose while staying flexible to changing rules that shift daily. I built what Boyd called “implicit guidance and control” into our systems. Our team knew the mission well enough to act independently when circumstances changed faster than communication could keep up. When offices and franchisees located around the globe called with questions, the decision framework was simple: “What’s right that aligns with the values of the business?” Elite performers cycle back to observation before competitors finish their first decision.

Getting Inside Your Opponent’s OODA Loop

Boyd’s real insight was “getting inside your opponent’s decision cycle.” By going through the OODA Loop faster than your competition, you make them react to your moves instead of executing their own strategy. At RE/MAX and other companies I oversee, we institutionalized rapid OODA cycling through what I call the 3-2-1 decision-making process. Instead of asking management questions and waiting for responses, I empower people to come up with 3 ideas to solve a problem, narrow them to the 2 best options, then make 1 recommendation. This quick process gets action taken immediately. By the time our competitors made their first major move, we were already three moves ahead.
Building Organizational OODA Capability

You don’t develop OODA Loop mastery during the crisis; you must develop it beforehand, implementing steps like these:

Accelerate Observation Systems: Create information flows that give you earlier intelligence than the competition, and host weekly clarity meetings to keep everyone aligned.
Sharpen Orientation Through Training: Regularly run exercises like what-if games, so orientation becomes automatic during real disruptions.
Practice Quick Decision-Making: Create safe environments for consequential decisions under pressure and build the understanding that it’s okay to make decisions.
Build Rapid Execution Systems: Design processes that implement decisions faster than circumstances change.

Communication Within the Loop

Boyd understood the OODA Loop isn’t just individual; it’s organizational. Your team’s ability to share observations, align on orientation, coordinate decisions, and synchronize action determines collective speed. Maintain OODA coherence by applying tactical communication principles:

Observe together
Orient collectively
Decide with clarity
Act in coordination

As you get used to “dancing the OODA” together, you’ll see instinctive decision-making perpetuate team success.

The Ultimate Weapon

Crisis leadership is more than just being fearless; it’s about being intentionally and strategically faster. Colonel Boyd’s OODA Loop that kept fighter pilots alive is the same framework that kept our businesses thriving during market downturns. Whether facing armed suspects or market volatility, the fundamentals remain the same. In competitive environments, the speed of decision-making becomes your ultimate weapon. Not reckless speed, but the disciplined speed that comes from mastering Boyd’s cycle. When your next crisis hits, the question isn’t whether you’ll face uncertainty and pressure; the question is whether you’ll cycle through your responses faster than the crisis itself can evolve. Your competition is counting on you to hesitate.
Your team is counting on you to lead. Time to close the loop.
Every company wants to have an AI strategy: a bold vision to do more with less. But there's a growing problem, one that few executives want to say out loud. AI initiatives aren't delivering the returns they were hoping for. In fact, many leaders now say they haven't seen meaningful returns at all. IBM recently found that only 1 in 4 AI projects hit the expected ROI. And BCG's research goes further still: 75% of businesses have seen no tangible value from their AI investments.

Stop buying tools your team doesn't know how to use

The fix? Increase your investment in AI training to support your business transformation. The data tells a simple story. An Akkodis survey suggested only 55% of CTOs believe their executive teams have the AI fluency needed to grasp the risks and opportunities of the tech. Yet it is these same executives who are trying to reengineer entire workflows, teams, and business models around tools that their people barely understand. And when performance disappoints, the knee-jerk reaction is to buy even more tech. More platforms. More licenses. More dashboards. But that only makes the problem worse. The teams that were struggling to learn one tool are now juggling five. Everyone's overwhelmed. No one's effective. And adoption flatlines. Even if you have the most advanced tech in the world, if your team doesn't know how to use it effectively, it's worthless.

Expand your training budget

But, equally, throwing money indiscriminately at AI education alone isn't going to fix the problem. The training investment must be smart. And that means implementing training programs that are truly pan-company and aligned with the business objectives. Too many businesses funnel their AI training into a tiny corner of their workforce, usually just their IT, engineering, or data teams. And while these teams do need support, they're not the ones who are going to deliver the productivity gains that you are trying to realize.
That job falls to the rest of your company: the 90% working in frontline roles and business functions where the AI transformation will be felt most, whether that's operations, strategy, product development, sales, finance, marketing, HR, legal, or customer service. These are the people who run your business. And if they don't know how to apply AI to their day-to-day work, your transformation will stall. If the goal is to modernize the business end to end, your training needs to reach end to end.

Teach data and AI literacy before you teach tools

At the same time, surface-level AI training that focuses only on tools (how to write a prompt, where to click, how to navigate an interface) will also fall short. Effective AI training needs to build capability, not breed dependency. The best results come when your people understand what's happening under the hood. Don't get me wrong: your team members don't all need a PhD in computer science. But they do need solid data literacy. They need to know how to interrogate, interpret, and act on data. The real value of data comes from understanding what it can actually do, seeing its potential and seizing it with both hands. Without even the most basic data skills, AI will create beautiful spreadsheets that can't be acted on. And that's not the revolution anyone had in mind.

Train your managers just as much, if not more

Equally, when it comes to AI training, there's a myth I sometimes hear: managers don't need AI training because they're not doing the work. Their job is to manage the team or set the vision, not run the tools. But that logic falls apart quickly. First, I can think of countless ways that AI can make managers more effective: being able to synthesise and extract lessons from performance data, providing their team with hands-on guidance on how to use AI, and spotting opportunities to reengineer workflows. But, more importantly, consider the message that not training your managers sends to your wider team.
It runs the risk of your wider company writing off your transformation as “hot air” and “warm words” rather than concrete, in-the-trenches implementation. Wide-scale transformation needs managers who can lead by example. If you train the team but skip the managers, don't be surprised when nothing changes.

Build a culture that lets people use what they learn

Finally, even the best training program will fall flat if your workplace punishes people for using it. In many businesses, employees are quietly, and perhaps unconsciously, discouraged from using AI. There's a genuine fear that if they're seen to be using AI, they will be criticised for cutting corners or cheating. The result? Team members keep their heads down and go back to old habits. In other companies, colleagues are afraid to give AI a go in the first place. They're hamstrung by a fear that they'll make a mistake or get something wrong. In both cases, your training budget goes to waste. So, if you want this to work, you need to create a culture of experimentation and entrepreneurship, where trying something new is actively encouraged, not seen as a risk, and where teams share learnings, trade prompts, and build real know-how together.

Too many companies are pinning their hopes on the next big AI tool. But no tool, no matter how powerful, will move the needle if your people don't know how to use it. The smart move right now isn't just buying more software. It's training your people to work smarter with the tech you already have. That's how you make AI worth the investment. That's how you turn strategy into results. And that's what will, ultimately, stop your AI vision from dying on paper.