Zero to #2 on Google: How to Rank High-Intent Keywords in Spain

Projects · 5 min read

Three months ago I launched Conversor IAE CNAE, a free tool for Spanish freelancers and businesses to convert between IAE and CNAE 2025 codes.

Today it ranks #2 on Google for "conversor iae cnae", a query with 200+ monthly searches.

I didn't hire an SEO agency. I didn't buy backlinks. I didn't stuff keywords.

What I did was apply Munger's principle: "Invert, always invert". Instead of thinking about how to rank, I thought about how Google decides what to show first. And I built from there.

Here's the exact system.

Why This Project Had Good Odds

Before diving into tactics, you need to understand why Conversor IAE CNAE had a shot at ranking:

1. Perfect intent match

  • Someone searching "conversor iae cnae" wants to convert IAE to CNAE codes
  • My tool does exactly that
  • No ambiguity

2. Vertical, specific market

  • 3.2 million freelancers in Spain
  • High-intent searches (not curiosity)
  • Low competition compared to other niches

3. Structured, trustworthy data

  • 2,247 official codes (1,187 IAE + 1,060 CNAE)
  • Verifiable public sources
  • Inherent authority

But having a good product isn't enough. I needed Google to find it and understand it.

Phase 1: Technical Foundations (Weeks 1-2)

My first move was to set aside traditional SEO tactics and focus on making sure Google could crawl and index the site correctly.

Technology stack:

  • Next.js 15.5.4 (for SSR and performance)
  • Supabase PostgreSQL (reliable data)
  • Vercel (hosting with CDN)
  • TypeScript (no type errors)

Specific steps:

Configure Next.js for SEO

```typescript
// next.config.ts
export default {
  trailingSlash: false, // Clean URLs
  compress: true,
  poweredByHeader: false,
  generateEtags: true,
};
```

This detail is critical. Google treats `/page` and `/page/` as two different URLs, so enforcing a single canonical form avoids duplicate-content problems. My commit on 12/20/2025 was specifically for this: `chore: add explicit trailingSlash: false to next.config.ts`

Sitemap and Robots.txt

I generated a dynamic sitemap with all 2,247 codes:

```typescript
// app/sitemap.ts
export default async function sitemap() {
  const codes = await fetchAllCodes(); // From Supabase

  return codes.map((code) => ({
    url: `https://conversoriaecnae.es/codigo/${code.id}`,
    lastModified: new Date(),
    priority: 0.8,
  }));
}
```

My commit on 12/21/2025 (`fix: resolve 7 GSC robots.txt warnings`) came after Google Search Console flagged errors in my robots.txt. I fixed them all. That's when the site started ranking.
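In the App Router, robots.txt can also be generated from code, which makes it harder for errors like these to creep back in. A minimal sketch using Next.js's `app/robots.ts` convention (the rule set here is illustrative, not my exact configuration):

```typescript
// app/robots.ts — Next.js serves the returned object as /robots.txt
export default function robots() {
  return {
    rules: [{ userAgent: '*', allow: '/' }],
    sitemap: 'https://conversoriaecnae.es/sitemap.xml',
  };
}
```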

Phase 2: Structure and Content (Weeks 3-4)

Google understands structure. I organized content like this:

Homepage

  • H1: "Conversor IAE CNAE 2025"
  • Clear problem explanation
  • Main CTA (search)
  • FAQ with schema markup
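The FAQ schema markup can be generated straight from the question list. A sketch of the idea; the `Faq` type and `buildFaqSchema` helper are my illustration, not code from the actual repo:

```typescript
// Build FAQPage JSON-LD (schema.org) from a list of Q&A pairs
type Faq = { question: string; answer: string };

function buildFaqSchema(faqs: Faq[]) {
  return {
    '@context': 'https://schema.org',
    '@type': 'FAQPage',
    mainEntity: faqs.map((f) => ({
      '@type': 'Question',
      name: f.question,
      acceptedAnswer: { '@type': 'Answer', text: f.answer },
    })),
  };
}
```

Rendered into a `<script type="application/ld+json">` tag, this is what makes the FAQ eligible for rich results.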

Individual Code Pages

  • URL: `/codigo/[id]`
  • H1: "Código IAE 5821 - Business Consulting"
  • Official description
  • Equivalent CNAE code
  • Context and use cases
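Because every code page's H1 follows the same template, it can come straight from the data. A hypothetical helper (names and shape are mine, not the project's):

```typescript
// Produce the H1 for a code page,
// e.g. "Código IAE 5821 - Business Consulting"
type CodeInfo = { system: 'IAE' | 'CNAE'; id: string; label: string };

function codePageHeading(code: CodeInfo): string {
  return `Código ${code.system} ${code.id} - ${code.label}`;
}
```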

Optimized FAQ

My commit on 12/20/2025 was crucial: `fixed faq templates with categories for a better indexation in google`

It's not a minor detail. Google uses FAQs for featured snippets. I changed my structure from:

```json
// BEFORE (flat)
{
  "faq": [
    { "question": "What is IAE?", "answer": "..." }
  ]
}

// AFTER (with categories)
{
  "faqByCategory": {
    "basics": [
      { "question": "What is IAE?", "answer": "..." }
    ],
    "conversion": [
      { "question": "How do I convert IAE to CNAE?", "answer": "..." }
    ]
  }
}
```

That generated more indexable snippets.
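The migration from the flat shape to the categorized one is a simple grouping step. A sketch, assuming each entry carries a `category` field (my naming, not necessarily the project's):

```typescript
// Regroup a flat FAQ list into the categorized shape
type FlatFaq = { question: string; answer: string; category: string };
type FaqEntry = { question: string; answer: string };

function groupFaqsByCategory(faqs: FlatFaq[]): Record<string, FaqEntry[]> {
  return faqs.reduce<Record<string, FaqEntry[]>>((acc, { category, question, answer }) => {
    (acc[category] ??= []).push({ question, answer });
    return acc;
  }, {});
}
```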

Phase 3: Maintenance and Fixes (Weeks 5-12)

This is where most people fail. They assume the work ends at launch.

I monitored Google Search Console every single day.

The 404 Error Trap

My commit on 12/19/2025: `Add 301 redirects to fix GSC 404 errors`

Google showed me routes returning 404s. Why? Because I changed the URL structure during development.

I created 301 redirects for each one:

```typescript
// middleware.ts
import { NextRequest, NextResponse } from 'next/server';

const oldPaths: Record<string, string> = {
  '/iae/5821': '/codigo/5821',
  '/cnae/6202': '/codigo/6202',
};

export function middleware(request: NextRequest) {
  const pathname = request.nextUrl.pathname;

  if (oldPaths[pathname]) {
    return NextResponse.redirect(
      new URL(oldPaths[pathname], request.url),
      { status: 301 }
    );
  }
}
```

This is critical: 301 redirects pass authority. If you have backlinks to old URLs, 301s redirect that authority to the new site.

URL Normalization

My commit on 12/21/2025: `fix: resolve 7 GSC robots.txt warnings and add URL normalization middleware`

Google detected duplicate URLs. I implemented:

```typescript
// Middleware: normalize uppercase paths to their lowercase canonical form
if (pathname !== pathname.toLowerCase()) {
  return NextResponse.redirect(
    new URL(pathname.toLowerCase(), request.url),
    { status: 301 }
  );
}
```

Performance

My stack uses:

  • **Next.js Image Optimization**: Images served in WebP
  • **Vercel Analytics**: Real monitoring of Core Web Vitals
  • **React Query**: Smart data caching

Google loves speed. My LCP (Largest Contentful Paint) is 0.8s.

Phase 4: The Final Push (Week 12)

My most recent commit on 12/27/2025: `fix: SEO Phase 1-3 improvements - indexation, UX, and 404 pages`

This included:

1. Custom 404 page with links to popular pages
2. Complete schema markup (FAQPage, WebSite, SearchAction)
3. Dynamic meta descriptions for each code
4. Open Graph tags for social sharing
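The dynamic meta descriptions can be templated per code. A minimal sketch; the `CodeEntry` shape and `metaDescription` helper are my own illustration, not the project's actual code:

```typescript
// Build a per-code meta description from structured data
type CodeEntry = { system: string; id: string; label: string; cnaeEquivalent?: string };

function metaDescription(code: CodeEntry): string {
  const base = `Código ${code.system} ${code.id}: ${code.label}.`;
  const extra = code.cnaeEquivalent
    ? ` Equivalencia CNAE 2025: ${code.cnaeEquivalent}.`
    : '';
  // Truncate to ~160 characters so it displays fully in search results
  return (base + extra).slice(0, 160);
}
```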

The Real Numbers

  • **Time**: 12 weeks
  • **Current position**: #2 (sometimes #1)
  • **Monthly search volume**: 200+ searches for the main keyword
  • **Conversion**: 8-12% of users actively use the tool
  • **Cost**: $0 in marketing

What I Didn't Do

  • Didn't buy backlinks
  • Didn't write guest posts
  • Didn't pay for ads
  • Didn't stuff keywords
  • Didn't copy competitor content

What I Learned

1. **Google understands intent.** If your product solves exactly what someone is searching for, you'll rank. You don't need tricks.

2. **Technical details win.** While others talk about "quality content", I was fixing trailing slashes and robots.txt.

3. **Constant monitoring pays off.** Google Search Console showed me every error before it became a problem. I checked every day.

4. **301 redirects are underestimated.** Most startups change URLs without thinking. That kills rankings. 301s save lives.

Your Turn

If you have a product or idea that solves a specific problem in your market:

1. Identify high-intent keywords (people searching for exactly your solution)
2. Build the tool first, SEO after
3. Implement the technical fundamentals (sitemap, robots.txt, trailing slashes)
4. Monitor Google Search Console every day
5. Fix 404s and errors before you worry about rankings
6. Be patient. It's not magic. It's 12 weeks of consistent work.

You don't need agencies. You need discipline.

Building something similar? Share your progress. Building in public accelerates everything.