Hacker News Viewer

MZI-based transistorlessness might finally be here

by aniijbod on 4/14/2026, 7:37:51 PM

https://write.as/mnggfj7asl07k

Comments

by: zitterbewegung

This is something I always thought: you could make an optical computer using MZIs or other technologies that don't have very "exact" requirements on computation. Similar to how LLMs are already run on consumer devices like a MacBook Pro, where we quantize to 4-bit computations, you could hypothetically run a larger model using MZIs to do inference on those systems.

Since you are only changing the underlying model every so often, instead of running a large training loop, once you set up an optical computer for inference it scales as 2n+1, with clock speeds of up to 100 THz at only 100 W of power, versus traditional GPUs at 2 GHz drawing 1 kW for 15k cores.
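A quick numerical sketch of the tolerance argument above (my own illustration, not from the article or the commenter): quantize a toy layer's weights to 4 bits, run the matrix-vector product, then repeat with added analog noise at the scale of the quantization step, standing in for interferometric imprecision. The noisy result stays close to the digital 4-bit result.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize_4bit(w):
    """Symmetric 4-bit quantization: snap weights onto 16 levels."""
    scale = np.abs(w).max() / 7  # int4 range is roughly [-8, 7]
    q = np.clip(np.round(w / scale), -8, 7)
    return q * scale  # dequantized values the matmul actually sees

# A toy "layer": 64 outputs from 256 inputs.
w = rng.standard_normal((64, 256))
x = rng.standard_normal(256)

exact = w @ x                 # full-precision reference
wq = quantize_4bit(w)
digital_4bit = wq @ x         # digital 4-bit inference

# Model analog imprecision as per-weight Gaussian noise a fraction
# of the 4-bit quantization step itself.
step = np.abs(w).max() / 7
analog = (wq + rng.normal(0, step / 4, wq.shape)) @ x

def rel(a, b):
    return np.linalg.norm(a - b) / np.linalg.norm(b)

print(f"4-bit vs exact:  {rel(digital_4bit, exact):.3f}")
print(f"analog vs 4-bit: {rel(analog, digital_4bit):.3f}")
```

Both relative errors come out small compared to the signal, which is the sense in which analog hardware "only" needs to match the precision the model was already quantized to.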

4/17/2026, 9:15:19 PM


by: LeroyRaz

This reads like something AI generated...

4/17/2026, 9:06:33 PM


by: refulgentis

"That's not a lab toy. That's a product-trajectory data point."

Sigh. (Why? Because now I have to guess how much is vague handwaving, or an AI trying to fit a square peg into a round hole, and how much is reality.)

4/17/2026, 9:05:13 PM


by: refulgentis

I get the "AI uses < 32-bit weights sometimes" thing, intimately, but I feel like I'm missing:

A) Why that means calculations can be imprecise. The weights are data stored in RAM; is the idea that we'd use > N-bit weights and say they're effectively N-bit due to imprecision, so we're good? Because that'd cancel out the advantage of using < N-bit weights. (Which, of course, is fine if B has a strong answer.)

B) A aside, why is photonics preferable?

4/17/2026, 9:03:19 PM


by: irickt

An interesting and accessible article on the increased plausibility of photonic compute.

4/17/2026, 8:38:52 PM