Operations Calling is Tulip’s flagship event for the global manufacturing community — a space where leaders, engineers, and innovators come together to exchange ideas and see the future of operations unfold. The 2025 edition gathered more than 750 attendees from across industries for two days of keynotes, workshops, and live demos exploring how AI, composability, and human ingenuity are defining a new era of connected, intelligent operations.
In the session Building Smarter Factories: The AI Ecosystem Advantage, experts from Tulip, NVIDIA, and DeepHow explored how collaborative innovation is shaping the next generation of industrial systems. Moderated by Eddy Atkins, Tulip’s Lead of Strategic Alliances for EMEA, the discussion featured Siva Lakshmanan, CEO of DeepHow, and Alvin Clark, Global Developer Relations Manager at NVIDIA.
Together, they made a compelling case for the power of ecosystems — arguing that AI in manufacturing cannot advance through isolated technologies. It takes open collaboration between infrastructure providers, software platforms, and application developers to make AI scalable, reliable, and useful where it matters most: on the factory floor.
Industrial AI is rarely a one-company story. Each speaker approached the problem from a different layer of the stack — infrastructure, software, and user experience — yet all converged on a common idea: innovation now happens between systems, not within them.
Eddy framed this shift through Tulip’s long-standing commitment to openness. Manufacturing systems, he explained, have historically been built as closed silos that trap data and limit flexibility. The future requires a different approach — one where each component of the industrial tech stack is interoperable by design.
Alvin described NVIDIA’s role in this architecture as foundational, providing the computing layer that enables partners like Tulip and DeepHow to deploy AI models across edge devices and cloud environments. Siva emphasized that this openness is what allows specialized companies like DeepHow to contribute domain-specific intelligence — in his case, capturing human expertise through AI-driven video training.
The conversation reinforced a central truth: the future of AI in manufacturing will not be defined by competition between vendors, but by collaboration across ecosystems.
Edge computing emerged as one of the panel’s recurring themes. Alvin explained that real-time responsiveness requires processing power as close to the work as possible. NVIDIA’s accelerated computing stack — from GPUs to embedded Jetson devices — allows vision models and AI agents to run at the edge, minimizing latency and improving reliability.
“Industrial environments don’t have the luxury of delay,” he noted. “When you’re dealing with inspection, robotics, or process control, every millisecond counts.”
Siva added that this principle also applies to people. DeepHow’s AI models capture and process human knowledge directly at the workstation, turning tacit expertise into structured training modules in real time. Operators record a task, and the system instantly identifies key steps, errors, and improvement opportunities.
Eddy connected these perspectives through the lens of interoperability. When AI insights are generated at the edge — whether by machines or people — they must be easily consumable by the broader ecosystem. Composable platforms like Tulip make it possible to route those insights back into production workflows, creating a continuous feedback loop between data, context, and action.
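The panel did not share implementation details, but the routing pattern Eddy describes can be sketched in miniature. In this hypothetical example, `EdgeInsight`, `publish`, and `next_action` are illustrative names (not Tulip or NVIDIA APIs), and a simple in-process queue stands in for whatever event bus a real platform would use:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from queue import Queue

@dataclass
class EdgeInsight:
    """One insight produced at the edge, by a vision model or an operator."""
    source: str                 # hypothetical device/station id, e.g. "cam-03"
    kind: str                   # e.g. "defect_detected", "step_completed"
    payload: dict = field(default_factory=dict)
    ts: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

# A shared queue standing in for the platform's event bus (an assumption,
# not an actual Tulip component).
workflow_bus: Queue = Queue()

def publish(insight: EdgeInsight) -> None:
    """Route an edge-generated insight back into the production workflow."""
    workflow_bus.put(insight)

def next_action() -> str:
    """Consume the latest insight and decide the follow-up workflow step."""
    insight = workflow_bus.get()
    if insight.kind == "defect_detected":
        return f"open_rework_ticket:{insight.source}"
    return "continue_production"

publish(EdgeInsight(source="cam-03", kind="defect_detected",
                    payload={"station": "weld-2", "confidence": 0.94}))
print(next_action())  # → open_rework_ticket:cam-03
```

The loop closes when the returned action triggers the next workflow step, which in turn generates new insights at the edge.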
The panelists agreed that the most significant barrier to scaling AI isn’t data scarcity but data meaning. Siva summed it up succinctly: “AI without context is just math.”
DeepHow’s models are designed to give AI that missing context — translating unstructured video and voice data into labeled, searchable insights that represent how work actually happens. Alvin expanded on this by explaining how contextual data improves AI reasoning. The more a system understands about environment, materials, and operator intent, the more accurately it can predict outcomes.
Eddy tied these threads back to Tulip’s composable data model, where every action — human, machine, or sensor-driven — is part of a shared schema. This approach allows engineers to build workflows that reflect the true state of production rather than abstracted analytics.
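Tulip's actual schema is not spelled out in the session, but the idea of one shared shape for human, machine, and sensor events can be illustrated. Everything below (`ProductionEvent`, `line_state`, the field names) is a hypothetical sketch, not the platform's data model:

```python
from dataclasses import dataclass
from typing import Literal, Optional

# One shared schema for every event, regardless of what produced it.
Actor = Literal["human", "machine", "sensor"]

@dataclass(frozen=True)
class ProductionEvent:
    actor: Actor             # who or what generated the event
    station: str             # where on the line it happened
    action: str              # what happened, e.g. "cycle_complete"
    value: Optional[float]   # optional measurement attached to the action

def line_state(events: list[ProductionEvent]) -> dict[str, str]:
    """Reduce the event stream to the latest action seen at each station."""
    state: dict[str, str] = {}
    for e in events:
        state[e.station] = f"{e.actor}:{e.action}"
    return state

events = [
    ProductionEvent("sensor", "press-1", "temp_reading", 182.5),
    ProductionEvent("machine", "press-1", "cycle_complete", None),
    ProductionEvent("human", "qa-bench", "visual_inspection_pass", None),
]
print(line_state(events))
# → {'press-1': 'machine:cycle_complete', 'qa-bench': 'human:visual_inspection_pass'}
```

Because every event conforms to the same shape, a workflow can fold operator actions, machine cycles, and sensor readings into a single view of production state rather than reconciling three separate formats.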
In an industry where visibility often stops at dashboards, context-aware AI is what bridges the gap between analysis and execution. It transforms fragmented data into operational intelligence — the foundation for continuous improvement.
As the discussion evolved, the speakers turned their focus to how industrial software itself is changing. The next generation of tools, Alvin explained, will be built less as standalone applications and more as components within a distributed ecosystem.
He highlighted NVIDIA’s TAO Toolkit and Omniverse platforms as examples of this approach. Developers can train, test, and deploy industrial AI models without needing to build infrastructure from scratch. Each partner adds value through specialization — simulation, data ingestion, workflow design — connected through shared standards and APIs.
Eddy observed that this modular philosophy mirrors Tulip’s own evolution toward composable operations. “We’re seeing the industrial stack become more like the web stack,” he noted. “Interconnected services, lightweight integrations, and flexibility at every layer.”
Siva agreed, pointing to how seamlessly DeepHow embeds into other systems as proof that the future of manufacturing software lies in interoperability, not ownership. When tools communicate through open frameworks, innovation accelerates across the ecosystem rather than staying locked inside any single vendor's walls.
The conversation concluded with a look at what’s next — the emergence of agentic AI systems capable of collaborating across both digital and physical domains.
Siva described a near future where instructional AI, like DeepHow’s training agents, will interact directly with operational AI running in Tulip or NVIDIA-driven environments. One guides people; the other optimizes processes. Together, they create a closed loop between learning and doing.
Alvin pointed out that realizing this vision requires careful orchestration. Agents will need to operate within shared governance frameworks that define how data flows and decisions are made. “Autonomy without interoperability isn’t progress,” he observed. “It’s chaos.”
Eddy echoed this sentiment, noting that as agentic systems mature, success will hinge on collaboration — not only between humans and machines but also between the platforms that enable them. The next phase of manufacturing intelligence will depend on aligning those systems toward a common purpose: amplifying human capability at scale.
Building Smarter Factories captured what many in the audience described as the most technically grounded conversation of Operations Calling 2025. Rather than focusing on AI as a product or a trend, it explored AI as an architecture — a living network of tools, models, and people working in sync.
The message was clear: the future of manufacturing belongs to open ecosystems, where innovation is shared, intelligence is contextual, and progress is collaborative by design.
🎥 Watch this session and all other Operations Calling 2025 presentations on demand.