When a Senior Leader Dismisses Python, That's the Signal

There's a recurring conversation in Hong Kong's tech leadership circles where someone senior dismisses Python as a toy language. The dismissal tells you more about the speaker than the language — and the implications run deeper than a syntax debate.

There’s a recurring conversation in Hong Kong’s tech and business leadership circles. Someone senior — a CIO, a CTO, occasionally a board-level voice — waves their hand at Python and calls it a toy language. Entry-level. Not serious.

It’s said with the confidence of someone who has earned the right to have opinions. And it’s wrong in a way that tells you more about the speaker than the language.

The view from 2010 was coherent

To be fair, the criticism had a basis once. Python was slow. Its concurrency model had real constraints. For production systems carrying enterprise-grade load — banking infrastructure, telco platforms, large-scale ERP — Java and C++ were reasonable defaults, and Python genuinely occupied a scripting or prototyping role.

That was a decade and a half ago.

What Python’s stack actually looks like now

The production systems behind some of the world's most commercially significant infrastructure are heavily Python-based.

  • Machine learning and AI infrastructure: PyTorch, TensorFlow, the Anthropic and OpenAI SDKs — all Python-primary
  • Data pipelines at scale: Apache Airflow, dbt, Pandas, Polars — Python throughout
  • Backend APIs: FastAPI benchmarks competitively with Node.js and outperforms many Java frameworks on equivalent workloads
  • Automation and orchestration: the dominant language across every serious workflow tool — n8n Python nodes, LangChain, CrewAI, and the agentic AI layer being built right now across the industry

Google, Meta, Stripe, Anthropic — none of these organisations treat Python as a toy. They treat it as infrastructure.

The CIO dismissing Python in 2026 is reasoning from a mental model built in a different era, applied without revision to a stack that no longer resembles what they remember.

[Figure: stacked comparison of where Python sits in modern infrastructure across machine learning, data pipelines, backend APIs, and agentic AI tooling.]
The 2010 stack and the 2026 stack are not the same conversation.

The language is not the problem

Here’s the structural point worth sitting with: a senior technology leader’s job is not to have a favourite language. It is to match tools to requirements — performance profile, team competency, ecosystem maturity, time-to-ship, and long-term maintainability.

Python is the right choice in many contexts. It is not the right choice in others. A systems-level component with hard latency requirements might call for Go or Rust. A high-concurrency financial matching engine might still want C++. The decision framework is requirement-driven, not identity-driven.

When a leader dismisses an entire language categorically, they are not demonstrating engineering sophistication. They are demonstrating a preference calcified into a position. That distinction matters when the same leader is making hiring calls, architectural decisions, and build-versus-buy judgements across a technology organisation.

The AI era makes this more acute

There is a second layer to this that did not exist five years ago.

In agentic AI development — building systems where AI models execute tasks, call tools, manage memory, and orchestrate multi-step workflows — Python is not incidental. It is the substrate. The entire open-source AI tooling ecosystem is Python-first. If your organisation is building anything on top of LLMs in 2026, your team needs Python fluency.
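The loop described above (a model choosing tools, executing them, and accumulating results in working memory) can be sketched in a few lines of plain Python. This is an illustrative sketch, not any particular framework's API: the tool names are hypothetical, and a scripted stub stands in for the LLM so the control flow is runnable.

```python
# Minimal sketch of an agentic tool-calling loop.
# In a real system, stub_model would be an LLM call that picks the
# next action; here it follows a scripted plan so the loop runs as-is.

def lookup_order(order_id: str) -> str:
    """Hypothetical tool: fetch an order's status."""
    return f"order {order_id}: shipped"

def send_email(to: str, body: str) -> str:
    """Hypothetical tool: send a notification."""
    return f"emailed {to}"

TOOLS = {"lookup_order": lookup_order, "send_email": send_email}

def stub_model(memory: list) -> dict:
    """Stand-in for the model: decides the next tool from current memory."""
    if not memory:
        return {"tool": "lookup_order", "args": {"order_id": "A123"}}
    if len(memory) == 1:
        return {"tool": "send_email",
                "args": {"to": "client@example.com", "body": memory[-1]}}
    return {"tool": None}  # plan complete

def run_agent() -> list:
    memory = []  # the agent's working memory, carried across steps
    while True:
        action = stub_model(memory)
        if action["tool"] is None:
            return memory
        # Execute the chosen tool and record the result for the next step.
        result = TOOLS[action["tool"]](**action["args"])
        memory.append(result)

print(run_agent())
```

Every mainstream agent framework (LangChain, CrewAI, the model-provider SDKs) is, at its core, a more capable version of this loop, which is part of why Python fluency is the entry ticket to building on it.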

A leader who has decided Python is beneath serious consideration is, practically speaking, deciding to sit out one of the most commercially significant technology shifts of the decade.

What this looks like in practice

At V8 Global, we build agentic systems for commercial clients — CRM pipelines, autonomous outreach workflows, AI-assisted business operating infrastructure. Python is part of the stack where it fits. So are other tools where they fit better.

The methodology is Scaffold: human-in-at-decisions, AI-in-at-execution. Language selection is an architectural call, made by requirement. The question we ask is never “is this language prestigious?” It is “does this tool do the job, is it maintainable, and can our system grow with it?”

That is the question every technology leader should be asking.

The takeaway

If you hear a senior leader dismiss Python as a toy language, that is not a data point about Python. It is a data point about how that leader updates their mental models.

In a technology environment moving as fast as this one, the ability to update is not a soft skill. It is a core leadership function.

The language debate is a proxy for something more important: whether the person making technology decisions is reasoning from current evidence, or from a position they formed a long time ago and never revisited.

That distinction has commercial consequences.


Alan Law is founder of V8 Global and architect of Axia. Leadership Insight posts examine the structural decisions behind AI-native commercial systems. For a working conversation about your stack, start here.
