What I suggest is that people experiment with them. At the least, read about what others are doing, paying attention to the details of their workflows. Better still, experiment yourself, and share your experiences.

Hallucinations aren’t a bug of LLMs; they are a feature. Indeed, they are *the* feature. All an LLM does is produce hallucinations; it’s just that we find some of them useful.

The difference in the answers can be as useful as the answers themselves.

A structural engineer builds in tolerance for all the factors she can’t measure. (I remember being told early in my career that the unique characteristic of digital electronics was that there was no concept of tolerances.) Process engineers account for the humans executing their tasks, who will sometimes be forgetful or careless. Software engineering is unusual in that it works with deterministic machines. Maybe LLMs mark the point where we join our engineering peers in a world of non-determinism.

The entire concept of an agentic browser extension is fatally flawed and cannot be built safely.

Technically, Google can store every message you receive and thus know everything about you, and U.S. agencies can request access to that data. So I decided to switch to another provider, one that respects privacy a bit more.

I decided to move all my emails from Gmail to mailbox.org, so I could (in future) completely wipe my Gmail account.