Analytics API: What Data You Actually Get (and When You Need to Add It)

Programming · 6 min read

The Problem: Analytics As an Afterthought

You've been building your SaaS for three months. The API works well, users are happy, and suddenly your boss (or your entrepreneurial instinct) asks: "How many users have seen this feature? How many have used it?"

And here's where everything falls apart.

Not because you don't have data. You do. But it's scattered across logs nobody reads, events you never captured, and metrics living in Vercel Analytics that tell you nothing useful about user behavior.

Most developers make the same mistake: they integrate analytics directly into the API core. That creates dependencies. It slows down responses. And when they need to switch providers (Mixpanel, Segment, Posthog), they have to refactor everything.

It doesn't have to be this way.

The Right Architecture: Analytics As an Optional Layer

Think of it this way: your API is the heart of your application. Analytics is the nervous system that monitors how that heart beats. The two shouldn't be tightly coupled.

The key is designing analytics as a decoupled add-on that can be there or not, without affecting the main flow.

This means:

1. Events, not real-time data: Instead of running complex queries when something happens, emit simple events
2. Asynchronous processing: Analytics never block your HTTP response
3. Flexible provider: Switching from Posthog to Segment doesn't require touching your API core

The Real Flow

```
User clicks
  ↓
API processes the action
  ↓
API responds (200 OK)
  ↓
Eventually, analytics event is emitted
  ↓
Analytics provider receives it
```

Notice what's important: the user never waits for analytics to process.

Practical Implementation: What Data to Capture

Now comes the question everyone asks: what exactly do I measure?

There are three categories of events that matter:

1. Impressions

A user sees something. This could be:

  • Page load
  • Component view
  • Modal or dialog appearance

This is the easiest to capture. On the frontend, you typically do it with a simple POST to your API:

```javascript
// In your React/Next.js component
const trackImpression = async (componentId) => {
  try {
    await fetch('/api/events/impression', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        componentId,
        timestamp: new Date().toISOString(),
        userId: getCurrentUserId(),
        sessionId: getSessionId()
      })
    })
  } catch (e) {
    // Silently fail. Analytics should never break the UX
    console.error('Analytics error:', e)
  }
}

// Use this in useEffect
useEffect(() => {
  trackImpression('hero-section')
}, [])
```

2. Clicks and Interactions

The user does something active. Examples:

  • Button click
  • Form submission
  • File download

These are more valuable than impressions because they indicate real intent.

```javascript
const trackClick = async (buttonId, metadata = {}) => {
  await fetch('/api/events/click', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      buttonId,
      userId: getCurrentUserId(),
      sessionId: getSessionId(),
      timestamp: new Date().toISOString(),
      ...metadata // You can add contextual data
    })
  })
}

// On your button
<button onClick={() => trackClick('upgrade-button', { plan: 'pro' })}>
  Upgrade
</button>
```

3. Engagement

This is more sophisticated. It measures how users interact with your product:

  • Time on page
  • Scroll depth
  • Searches performed
  • Filters applied
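
Engagement is harder to capture with a single POST because it accumulates over time on the client. As a rough sketch, here's one way to report scroll-depth milestones; the `/api/events/engagement` endpoint and the milestone values are assumptions, and `getSessionId()` is the same helper used in the earlier snippets:

```typescript
// Scroll-depth sketch (illustrative): report each milestone once per page view
const milestones = [25, 50, 75, 100]
const reported = new Set<number>()

const trackScrollDepth = () => {
  const scrolled = window.scrollY + window.innerHeight
  const depth = Math.round((scrolled / document.body.scrollHeight) * 100)

  for (const m of milestones) {
    if (depth >= m && !reported.has(m)) {
      reported.add(m)
      // Fire-and-forget, same as impressions and clicks
      fetch('/api/events/engagement', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          metric: 'scroll_depth',
          value: m,
          sessionId: getSessionId(),
          timestamp: new Date().toISOString()
        })
      }).catch(() => {
        // Analytics should never break the UX
      })
    }
  }
}

window.addEventListener('scroll', trackScrollDepth, { passive: true })
```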

Backend Side: Receiving and Storing Events

Now, how do you process these events in your API?

The right answer is: don't process them immediately.

Instead, emit them to a queue. This could be:

  • A table in your database (Supabase, for example)
  • A queue service (Bull, RabbitMQ)
  • An external service (Vercel KV, Redis)

```typescript
// In your API endpoint (Next.js)
export async function POST(req: Request) {
  const body = await req.json()

  // Validate minimally
  if (!body.userId || !body.componentId) {
    return Response.json({ error: 'Invalid event' }, { status: 400 })
  }

  // Emit the event to a queue (without waiting)
  queueEvent({
    type: 'impression',
    payload: body,
    timestamp: Date.now()
  }).catch(err => {
    // Log, but don't fail the response
    console.error('Queue error:', err)
  })

  // Respond immediately
  return Response.json({ success: true }, { status: 200 })
}
```
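
The `queueEvent` helper above is whatever queue you picked from the list. If you go with the Supabase option, a minimal sketch could look like this; the `analytics_events` table name and its columns are assumptions:

```typescript
// queueEvent sketch: a plain insert into a hypothetical `analytics_events` table.
// The worker picks rows up later, so no provider SDK is needed here.
import { createClient } from '@supabase/supabase-js'

const supabase = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY!
)

type QueuedEvent = {
  type: string
  payload: Record<string, unknown>
  timestamp: number
}

export async function queueEvent(event: QueuedEvent) {
  const { error } = await supabase.from('analytics_events').insert({
    type: event.type,
    payload: event.payload,
    created_at: new Date(event.timestamp).toISOString(),
    processed: false
  })
  if (error) throw error
}
```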

Then, an asynchronous worker processes those events:

```typescript
// A Cron job or worker that runs every minute
export async function handleAnalyticsQueue() {
  const events = await getQueuedEvents()

  for (const event of events) {
    // Send to your analytics provider
    await posthog.capture({
      distinctId: event.payload.userId,
      event: event.type,
      properties: event.payload
    })

    // Mark as processed
    await markEventAsProcessed(event.id)
  }
}
```
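
One detail worth flagging if this worker runs as a short-lived cron function: the posthog-node client batches events in memory, so flush it before the process exits. A sketch, assuming the client is created per invocation (adjust the host if you self-host):

```typescript
// posthog-node setup and flush for a short-lived worker invocation
import { PostHog } from 'posthog-node'

const posthog = new PostHog(process.env.POSTHOG_API_KEY!, {
  host: 'https://app.posthog.com' // adjust if self-hosted
})

export async function runAnalyticsCron() {
  await handleAnalyticsQueue()
  // Flush any buffered events before the function exits
  await posthog.shutdown()
}
```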

Choosing the Right Provider

This depends on your needs:

  • **Posthog**: Open source, self-hosted option, good for startups
  • **Segment**: Centralizes events from multiple sources
  • **Mixpanel**: Focused on product analytics
  • **Google Analytics 4**: Free, but less flexible
  • **Plausible**: Privacy-first, good if GDPR is critical

The advantage of your decoupled architecture is you can switch between them without touching your API core.
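
One way to make that switch cheap in practice is to keep the provider behind a small interface that only the worker knows about. The shape below is illustrative, not any SDK's API:

```typescript
// A thin provider abstraction: the worker calls this, adapters call the SDKs
import { PostHog } from 'posthog-node'

interface AnalyticsProvider {
  capture(distinctId: string, event: string, properties: Record<string, unknown>): Promise<void>
}

// Adapter for the posthog-node client used in the worker above
class PosthogProvider implements AnalyticsProvider {
  constructor(private readonly client: PostHog) {}

  async capture(distinctId: string, event: string, properties: Record<string, unknown>) {
    this.client.capture({ distinctId, event, properties })
  }
}
```

Swapping to Segment or Mixpanel then means writing another adapter, not touching the queue or the API core.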

The Important Aspect Many Forget: Privacy

We're in Spain/Europe. GDPR is real.

When you capture user data:

1. You need consent (typically via cookie banner)
2. You must anonymize sensitive data (don't store emails in click events)
3. Allow opt-out (some users don't want to be tracked)

A secure implementation:

```javascript
const trackEvent = async (eventName, data) => {
  // Check that user has consented to analytics
  if (!hasAnalyticsConsent()) {
    return
  }

  // Don't include sensitive personal data
  const safeData = {
    event: eventName,
    ...data,
    email: undefined, // Never
    phone: undefined, // Never
    userId: hashUserId(data.userId) // Anonymous
  }

  await fetch('/api/events', {
    method: 'POST',
    body: JSON.stringify(safeData)
  })
}
```
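
The `hashUserId` helper isn't shown above; one possible version uses the Web Crypto API available in modern browsers and Node 18+ (a stable SHA-256 hash gives you a pseudonymous ID, not full anonymization):

```typescript
// Possible hashUserId: SHA-256 of the user ID, hex-encoded (illustrative)
const hashUserId = async (userId: string): Promise<string> => {
  const data = new TextEncoder().encode(userId)
  const digest = await crypto.subtle.digest('SHA-256', data)
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, '0'))
    .join('')
}
```

Note that this version is asynchronous, so the call inside `trackEvent` would need an `await`. And since hashing is pseudonymization rather than anonymization under GDPR, consent and opt-out still apply.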

Real Case: How I Use This in My Projects

In my SaaS projects, the progression typically looks like this:

1. Day 1-30: No analytics. Just basic logs in Vercel.
2. Day 30-90: Add an `/api/events` endpoint that saves to a Supabase table.
3. Day 90+: Connect Posthog if I need complex dashboards.

This progression works because I never coupled analytics to the core.

When I decided to switch to Posthog, I only had to:

1. Add the Posthog SDK
2. Change the worker to send to Posthog instead of Supabase
3. Done. Zero changes to the API core.

Takeaway: Design for the Future

The lesson here isn't about which tool to use. It's about architecture.

When you build APIs, think in layers:

  • Core (business logic)
  • Observability (logs, errors)
  • Analytics (user behavior)
  • Monetization (payments, limits)

Each layer should be independent. Each should be able to change without breaking the others.

Analytics as an optional add-on is just one application of this principle.

Build this way from the start. Your future self will thank you when you need to pivot.