Hello
Can someone please tell me why @Leadership has decided to go for GPUs?
This is a bad move.
ASICs,
TPUs,
and then photonics are the best way forward.
I think they have a difference of opinion on the adoption curve… a matter of early, too early, just right, or too late.
From a practical stance, in my opinion what you mentioned is too early.
In my opinion, GPU is a bit late. Maybe I'm wrong. Maybe you're wrong. Maybe they're wrong, who knows.
My point is that Dfinity should focus on the future, specifically future-proofing the network.
By going down the same road as everyone else, they are throwing away their competitive edge (tiny, but an edge imo). The edge is that they have not gone down the GPU route. It's outdated technology that was never designed to run AI but modified to run it, and super inefficient.
GPUs will be redundant by 2026–27. (The only people pushing GPUs are those who hold Nvidia shares; it's not the best option, for a few reasons.)
I understand that due to current demand they are needed, but rent them, don't buy them! (Or spec them for someone else to buy.) Six to 18 months down the road they'll be redundant and energy costs will go up!
ASICs / TPUs / Photonics.
Look at what OpenAI have concluded! Ask yourself: why are they going down that road instead of buying more GPUs?
GPUs were great when they were the only option over CPUs. Now there are options.
A bit about GPU inefficiency~ they were repurposed from graphics rendering to AI workloads, which inherently creates bottlenecks. The architecture wasn’t originally designed for the matrix operations and data flows that modern AI requires.
The rental vs. purchase strategy is particularly astute from a capital allocation perspective. Given the 18-24 month hardware refresh cycles in AI, committing to owned GPU infrastructure could indeed leave Dfinity holding depreciated assets when more efficient alternatives emerge.
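The rent-vs-buy argument can be sketched as a simple break-even calculation. All figures below (purchase price, hourly rental rate, refresh cycle) are illustrative assumptions for the sketch, not vendor quotes:

```python
# Rent-vs-buy break-even sketch for AI accelerators.
# All numbers are illustrative assumptions, not real pricing.
PURCHASE_PRICE = 30_000   # USD per accelerator, bought outright
RENTAL_RATE = 2.50        # USD per accelerator-hour from a cloud provider
HOURS_PER_MONTH = 730

def total_cost(months: int, utilization: float) -> tuple[float, float]:
    """Return (buy_cost, rent_cost) over `months` at the given utilization.

    Buying is a fixed up-front cost (power/colo ignored for simplicity);
    renting scales with how many hours you actually use.
    """
    buy = PURCHASE_PRICE
    rent = RENTAL_RATE * HOURS_PER_MONTH * utilization * months
    return buy, rent

# At 50% utilization over an assumed 18-month refresh cycle,
# renting never catches up to the purchase price:
buy, rent = total_cost(months=18, utilization=0.5)
print(f"buy: ${buy:,.0f}  rent: ${rent:,.0f}")  # buy: $30,000  rent: $16,425
```

The point of the sketch: the shorter the hardware refresh cycle and the lower the utilization, the more renting dominates, which is exactly the situation the 18-24 month cycles above describe.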
The shift toward specialized silicon is already underway:
ASICs offer dramatic efficiency gains for specific AI workloads
TPUs have shown superior performance-per-watt in many scenarios
Photonic computing promises to solve fundamental bandwidth and energy constraints
Even optical neural networks are moving from research to early commercial applications
OpenAI’s recent pivot away from pure GPU scaling toward custom silicon and alternative architectures signals where the industry leaders see the future heading. They’re clearly anticipating the same efficiency wall.
For Dfinity, maintaining optionality while others lock into GPU-centric infrastructure could provide significant competitive advantage once next-generation compute becomes viable. The network that can seamlessly integrate emerging compute paradigms will likely have substantial cost and performance advantages over those built around today’s constraints.
(Last post for this week !)
Rethinking AI Compute
But there’s a major shift underway.
Photonics-Based AI Compute Is No Longer a Future Concept — It’s Here
Several companies now offer commercially available photonic AI accelerators and optical interconnects that dramatically outperform GPUs in:
Energy efficiency (up to 100x better)
Latency & bandwidth
Scalability across data centers
These systems are real, shipping, and in use:
| Company | Product | Status | Notes |
|---|---|---|---|
| Lightmatter | Envise AI chip | Shipping to partners | Performs matrix ops optically |
| Lightmatter | Passage interconnect | Available | Ultra-low-latency photonic fabric |
| Lightelligence | Hummingbird AI accelerator | In early deployment | Optical compute + electronics hybrid |
| Ayar Labs | TeraPHY Optical I/O | Production-ready | Drop-in chiplet for optical data links |
| Ayar Labs | SuperNova Light Source | Available | Photonic chiplet ecosystem |
| Intel | Silicon Photonics Transceivers | Shipping widely | Used by Amazon & Meta in AI centers |
These aren’t “someday tech.” They’re being integrated into high-performance systems today — and can radically transform how we think about compute at Dfinity.
Why This Matters for Dfinity
The Internet Computer is built on scalable, composable compute — but AI workloads (e.g. Caffeine AI, RAG pipelines, LLMs) need:
Massive parallelism
Low latency inference
Scalable inter-node bandwidth
GPUs don’t fit well in this model:
They’re centralized and monopolized.
They don’t mesh natively with WASM or the TEE-secured model Dfinity supports.
They burn too much energy per token.
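The "energy per token" point above can be made concrete with a back-of-envelope calculation. The power and throughput figures here are illustrative assumptions, not benchmarks of any specific device:

```python
# Back-of-envelope: energy cost of one generated token.
# All figures are illustrative assumptions, not measurements.

def joules_per_token(power_watts: float, tokens_per_second: float) -> float:
    """Energy per token = device power draw divided by token throughput."""
    return power_watts / tokens_per_second

# Assumed: a 700 W GPU serving 100 tokens/s, vs. a hypothetical photonic
# accelerator drawing 70 W at the same throughput (10x efficiency assumed).
gpu = joules_per_token(power_watts=700, tokens_per_second=100)       # 7.0 J/token
photonic = joules_per_token(power_watts=70, tokens_per_second=100)   # 0.7 J/token
print(f"GPU: {gpu:.1f} J/token  photonic (assumed): {photonic:.1f} J/token")
```

At data-center scale this per-token difference multiplies directly into the node providers' energy bills, which is why efficiency-per-watt matters for a decentralized network.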
What We Could Do Today
Integrate photonic interconnects in subnet data centers
→ Swap InfiniBand with optical I/O (e.g., Ayar Labs)
Run LLM inference on photonic coprocessors (e.g., Lightmatter Envise)
→ Ideal for WASM+TEE offload, low power inference canisters
Pilot a “Photonics R&D Subnet”
→ Partner with Lightmatter or Ayar Labs to explore replacing GPU-based workloads
Use ICP’s secure canister architecture to verify AI results computed on optical hardware, backed by zk proofs or TEEs
Let’s Lead
Dfinity can be the first blockchain ecosystem to adopt post-GPU AI infrastructure, paving the way for:
Sustainable decentralized AI
Trustless optical compute
Photonic inference at the edge
If anyone from the Dfinity Foundation is interested in discussing a pilot program or research collaboration, I’m happy to assist in bridging the hardware and software layers. The future of AI doesn’t have to be bottlenecked by NVIDIA.
Let’s light the path forward.