The idea for TownSquare came from a specific frustration, not a business plan. We were spending too much time on social media, feeling vaguely worse after each session, and trying to figure out exactly why.
Part of it was the echo chamber — the sense that the platform was gradually narrowing what we saw. But what bothered us more was the opacity. When a post got 200 likes, there was no way to know what those 200 people were thinking. Were they impressed? Amused? Just agreeing? Were they the right 200 people to validate the post, or were they all coming from a bubble that would have liked almost anything?
We wanted a platform where engagement carried meaning: where a vote told you something about who cast it and what it signified. In mid-2025, we started building it.
Choosing the Stack
The first decision was architecture. Building a social platform from scratch is a substantial undertaking, and we knew we'd need to make some pragmatic choices about where to invest engineering effort versus where to leverage existing infrastructure.
We landed on Firebase — Firestore for the persistent data layer, Firebase Realtime Database for ephemeral low-latency features like typing indicators and live presence, and Firebase Authentication for identity. The main app is built in vanilla JavaScript with Rollup as the bundler. We deliberately avoided heavy frontend frameworks for the web layer — Tailwind CSS for styling, but otherwise close to the metal. The advantages were significant: near-zero cold-start costs, real-time capabilities built in, and scalability that would let a small team punch above its weight.
The tradeoffs were real too. Firebase's pricing model means costs can spike unpredictably with traffic. Firestore's query model requires careful index management and imposes constraints on how you can filter and sort data. We've had to work around both, but the overall decision has held up well.
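To make the index-management point concrete: any Firestore query that filters on one field and orders by another needs a composite index declared ahead of time. A sketch of what that looks like in a firestore.indexes.json file, using hypothetical field names (a posts collection filtered by community and ranked by score), not our actual schema:

```json
{
  "indexes": [
    {
      "collectionGroup": "posts",
      "queryScope": "COLLECTION",
      "fields": [
        { "fieldPath": "communityId", "order": "ASCENDING" },
        { "fieldPath": "eloScore", "order": "DESCENDING" }
      ]
    }
  ]
}
```

Every new filter-plus-sort combination in the UI means another entry in this file, which is the "careful index management" tax in practice.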
The Features That Took the Longest
The ELO system
Getting the ELO implementation right was the most intellectually demanding part of the build. The core formula is simple enough — expected score, actual score, K-factor adjustment. But adapting it for social content required decisions that pure chess ELO doesn't face: How do you handle posts that were created months ago receiving votes today? How do you weight votes from different reputation tiers? How do you prevent high-reputation users from gaming their scores by voting strategically?
The current system uses a dynamic K-factor that scales with the reputation gap between the voter and the content author. Votes from users with significantly higher ELO move the needle more. Votes from users with lower ELO matter, but less. This creates a system where your reputation is essentially a weighted average of how high-reputation people assess your work — which is a much better signal than raw vote counts.
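The shape of that update can be sketched in a few lines. This is a minimal illustration of an Elo-style update with a reputation-scaled K-factor; the base K, the gap scaling, and the clamp bounds below are made-up values for the sketch, not TownSquare's production parameters.

```javascript
// Illustrative base K; the real value is a tuning parameter.
const BASE_K = 32;

// Standard Elo expected score for the author against the voter.
function expectedScore(authorElo, voterElo) {
  return 1 / (1 + Math.pow(10, (voterElo - authorElo) / 400));
}

// Dynamic K: grows when the voter outranks the author, shrinks when
// they don't, clamped so one vote can never swing the score too far.
function dynamicK(authorElo, voterElo) {
  const gap = voterElo - authorElo;
  const scaled = BASE_K * (1 + gap / 800);
  return Math.min(64, Math.max(8, scaled));
}

// outcome: 1 for a positive vote, 0 for a negative one.
function updateAuthorElo(authorElo, voterElo, outcome) {
  const k = dynamicK(authorElo, voterElo);
  return authorElo + k * (outcome - expectedScore(authorElo, voterElo));
}
```

With these numbers, an upvote from a voter rated 400 points above the author moves the author's score several times further than the same vote from a voter rated 200 points below, which is the "weighted average of high-reputation assessments" property in miniature.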
Real-time everything
Modern social platforms feel live. Comments appear in real time. Notification badges update without a page refresh. Typing indicators show you when someone's replying. Building these features correctly — with proper cleanup to prevent memory leaks, proper conflict resolution when multiple users edit simultaneously, and proper fallback behavior when the connection drops — took significantly longer than we initially estimated.
Firebase Realtime Database handles presence and typing indicators. Firestore handles the live comment and notification streams. The two systems have different characteristics and require different mental models, which created some integration complexity early in the build.
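The cleanup discipline mentioned above follows one pattern regardless of which Firebase system a listener comes from: both onSnapshot() and the Realtime Database attach calls hand back a way to detach, and collecting those per view makes teardown reliable. A self-contained sketch of that pattern; `fakeSubscribe` below stands in for a real Firebase listener so the code runs on its own:

```javascript
// Collects unsubscribe functions so a view can tear down every
// listener it opened in one call, preventing leaked subscriptions.
function createListenerGroup() {
  const unsubscribers = [];
  return {
    // Register any listener that returns an unsubscribe function.
    add(unsubscribe) {
      unsubscribers.push(unsubscribe);
    },
    // Detach everything, e.g. when the user navigates away.
    teardown() {
      while (unsubscribers.length) unsubscribers.pop()();
    },
  };
}

// Stand-in for a real-time listener: fires events on a timer and
// returns an unsubscriber, like Firestore's onSnapshot().
function fakeSubscribe(onEvent) {
  const timer = setInterval(() => onEvent("tick"), 10);
  return () => clearInterval(timer);
}

const group = createListenerGroup();
group.add(fakeSubscribe((e) => console.log("comments:", e)));
group.add(fakeSubscribe((e) => console.log("presence:", e)));
group.teardown(); // no leaked intervals once the view unmounts
```

The value of the pattern is that adding a fifth or sixth real-time feature to a view doesn't add a fifth or sixth place to forget cleanup.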
The mobile app
Partway through development, it became clear that a web-only platform was leaving significant value on the table. We built the mobile app in React Native with Expo — sharing the same Firebase backend as the web app, with the same collections and the same data structures. Features that users expect on mobile (haptic feedback on votes, native camera integration for post images, push notifications, location-aware posts) required native integrations that took time to get right across iOS and Android.
What We Got Wrong (and Fixed)
We shipped a waitlist page early on that accidentally created a "private beta, invite only" perception. This was a mistake — it made TownSquare seem more exclusive than it was and blocked organic sign-ups. We replaced it with an open registration flow as soon as we caught it.
We also initially underestimated how much the service worker caching strategy would affect users' experience of updates. A service worker that aggressively caches static assets means that code changes don't immediately reach users who already have the old version cached. We implemented an automatic service worker version bump on every production deployment to force cache invalidation — a change that now runs automatically but required some trial and error to get right.
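The version-bump step itself is small. The real pipeline edits the service worker file on disk at deploy time; the sketch below expresses it as a pure function over the file's text, and the `CACHE_VERSION` constant name is illustrative rather than our actual identifier. Because the browser re-fetches the service worker and reinstalls it whenever the file's bytes change, bumping a version string is enough to trigger the update, and keying cache names on that string lets the activate handler delete stale caches.

```javascript
// Replaces e.g. `const CACHE_VERSION = "v7";` with the next version,
// producing a byte-different worker that the browser will reinstall.
function bumpServiceWorkerVersion(source) {
  return source.replace(
    /const CACHE_VERSION = "v(\d+)";/,
    (_match, n) => `const CACHE_VERSION = "v${Number(n) + 1}";`
  );
}

const sw = 'const CACHE_VERSION = "v7";\n// ...caching logic keyed on CACHE_VERSION...';
console.log(bumpServiceWorkerVersion(sw).split("\n")[0]);
// → const CACHE_VERSION = "v8";
```

The trial and error was less about this transform and more about sequencing: the bump has to happen in the same deploy that ships the new assets, or users can end up with a worker version that doesn't match the files it caches.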
"The features that seem small — password reset flows, email verification, proper error messages — consistently take longer than expected and matter more than you think."
Where We Are Now
The platform launched with a feature set that we're proud of: the ELO reputation system, multi-axis voting, communities, notifications, a full mobile app, and a moderation system capable of handling real-world scale. The cloud functions that handle scheduled operations — reputation decay, email digests, analytics — are ready to deploy once we move to Firebase's Blaze plan.
The web app is live at thetownsquare.social. The iOS and Android apps are in final testing. We're at the point where the infrastructure is solid and the story we want to tell is clear. The next chapter is growth — finding the people who are as frustrated with current social media as we were, and giving them something better.
That's the part we're most excited about.