Strategy · Philippines

New Year, No Predictions

January 30, 2024 · 2 min read

We published a predictions post last January. We went back and read it in December. Our track record was about what you would expect: the things that were already happening continued, the things we thought might happen mostly did not, and the most significant developments of the year were not on our list at all.

We are not publishing predictions this year. Here is the short version of why, and what we are paying attention to instead.

Why We Stopped

Predictions posts serve the writer more than the reader. They create an appearance of foresight and, if the prediction happens to be correct, a quotable record. If the predictions are wrong, the post is quietly forgotten. The asymmetry is too convenient to be useful.

The more honest version: we do not have special predictive ability about where the software industry or the AI space is going. Neither does most of the content that confidently declares what the next year will bring. The industry moves fast enough that six-month-old predictions read like they were written in a different era. Twelve-month predictions have a survival rate that should make anyone humble.

For a studio our size, predictions are also an odd use of attention. We are not shaping industry direction. We are tracking it, and our clients are better served by us being accurate about what is already happening than by us being confident about what might.

What We Are Actually Watching

Three things that seem more useful to observe than predict.

AI agents moving from demo to deployment. The conversation in 2023 was about what language models could do in a controlled environment. The question for 2024 is what happens when they are deployed in real workflows, with real, messy data and real user behavior. We are more interested in what fails in production than in what works in demos.

The cost curve on AI services. Model inference costs have dropped significantly in the past year. That trajectory affects what becomes feasible to build for clients in the mid-market rather than just enterprise. We are watching which capabilities cross from expensive-and-niche to affordable-and-practical.

Client readiness for AI adoption. Not every business that wants an AI feature is ready to actually use one. The infrastructure questions (data quality, workflow integration, change management) are often more consequential than the model selection question. We are paying closer attention to the organizational side of AI integration.

What This Means Practically

We are spending more time on structured discovery and less time on speculative positioning. When clients ask us about AI, we want to give them an honest picture of what is production-ready for their specific use case, not a projection of where the industry might be in eighteen months.

We are also investing in our own understanding of AI solutions and automation through building rather than reading. The studios that will give accurate counsel in 2024 are the ones that have shipped AI features, learned from what broke, and iterated. That is the record we are building.

No predictions. Just attention.
