SSR for SEO: Why I Love Hwarang Uses Next.js Server Rendering
For a fundraising platform, discoverability through search engines is a survival requirement. Here's how SSR makes that possible.

I Love Hwarang is a fundraising platform. If potential donors can't find campaigns through Google, the platform fails. This single requirement — search engine discoverability — dictated the entire technical architecture.
Korean cultural heritage preservation is inherently a discovery problem. The target donors — Korean diaspora communities, cultural enthusiasts, and heritage organizations — are scattered across the internet. They discover campaigns through Google searches like 'support Korean temple restoration' or 'donate to hanbok preservation.' If our campaign pages don't rank for these queries, the campaigns fail. This isn't an optimization — it's the platform's core distribution mechanism.
Client-side rendered SPAs are invisible to search engines by default. Google's crawler can execute JavaScript, but it does so in a deferred second rendering pass that can lag initial crawling by hours or days, and dynamically loaded content is easy for it to miss. For a campaign page that needs to appear in search results within hours of creation, CSR is a non-starter.
Before committing to SSR, I evaluated three alternatives. Static Site Generation (SSG) with Next.js could pre-render campaign pages at build time, but new campaigns need to be discoverable immediately — waiting for a rebuild isn't acceptable. Incremental Static Regeneration (ISR) was closer, but the revalidation delay still meant a 60-second window where a newly created campaign served stale content. Pre-rendering services like Prerender.io could serve cached HTML to crawlers while keeping the SPA for users, but this added a dependency and introduced cache staleness issues. SSR was the only approach that guaranteed fresh, crawlable content on every request.
Next.js server-side rendering solves this cleanly. When Google's crawler requests a campaign page, our server renders the complete HTML — title, description, images, donation progress, everything. The crawler sees a fully-formed page, indexes it immediately, and the campaign starts appearing in search results within 24 hours.
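In the Next.js App Router, guaranteeing that every request is server-rendered fresh is a one-line route segment config. A minimal sketch; the route path is a hypothetical example, not the platform's actual file layout:

```typescript
// app/campaigns/[slug]/page.tsx (hypothetical route)
// Opt this route out of static caching so every request —
// including a crawler's — receives freshly rendered HTML.
export const dynamic = 'force-dynamic';
```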
The dynamic meta tag implementation required careful orchestration between the database, the rendering pipeline, and the HTML head. Each campaign page generates unique title tags, meta descriptions, and canonical URLs based on the campaign data in PostgreSQL. We use Next.js's generateMetadata function to fetch campaign details server-side and produce optimized meta tags. The title follows a consistent pattern — '[Campaign Name] | I Love Hwarang' — while the meta description pulls the first 160 characters of the campaign description, ensuring it reads naturally when truncated in search results.
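The title pattern and the 160-character description can live in small pure helpers called from generateMetadata. A minimal sketch, assuming a word-boundary truncation rule; the function names are illustrative, not the platform's actual code:

```typescript
// Hypothetical helpers for per-campaign meta tags.
// buildPageTitle follows the '[Campaign Name] | I Love Hwarang' pattern;
// buildMetaDescription cuts at a word boundary so the truncated
// text still reads naturally in search results.

const SITE_NAME = 'I Love Hwarang';
const MAX_DESCRIPTION = 160;

function buildPageTitle(campaignName: string): string {
  return `${campaignName} | ${SITE_NAME}`;
}

function buildMetaDescription(description: string): string {
  if (description.length <= MAX_DESCRIPTION) return description;
  // Cut to the limit, back off to the last full word, add an ellipsis.
  const cut = description.slice(0, MAX_DESCRIPTION);
  const lastSpace = cut.lastIndexOf(' ');
  return `${cut.slice(0, lastSpace > 0 ? lastSpace : cut.length).trimEnd()}…`;
}
```

In a real App Router app these helpers would be called inside the route's generateMetadata export, which returns `{ title, description }` along with openGraph fields for sharing previews.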
Social sharing was equally critical. When someone shares a campaign on Facebook, Twitter, or KakaoTalk, the sharing service fetches the URL and reads its Open Graph meta tags. With CSR, those tags aren't in the response because they're generated client-side. With SSR, they're in the HTML, and shares render rich previews with the campaign image, title, and description.
Beyond Open Graph tags, we implemented JSON-LD structured data using Schema.org's DonateAction and Event schemas. This tells Google's crawler not just that a page exists, but what it represents: a fundraising campaign with a specific goal amount, current progress, deadline, and organizing entity. The structured data has led to rich search results that display donation progress directly in Google's search listings — a significant click-through rate improvement over plain blue links.
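A sketch of how the DonateAction payload might be assembled for embedding in a `<script type="application/ld+json">` tag. The property names come from the Schema.org vocabulary; the Campaign shape, currency, and builder function are my assumptions:

```typescript
// Hypothetical shape of a campaign record; the real schema lives in PostgreSQL.
interface Campaign {
  name: string;
  url: string;
  goalAmount: number;
  deadline: string; // ISO 8601 date
  organizer: string;
}

// Build a Schema.org DonateAction object for the page head.
// price/priceCurrency come from TradeAction, recipient from DonateAction,
// target and endTime from the base Action type.
function buildDonationJsonLd(c: Campaign): Record<string, unknown> {
  return {
    '@context': 'https://schema.org',
    '@type': 'DonateAction',
    name: c.name,
    target: c.url,
    recipient: { '@type': 'Organization', name: c.organizer },
    price: c.goalAmount,
    priceCurrency: 'KRW',
    endTime: c.deadline,
  };
}
```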
The performance benefit was a bonus. Server-rendered pages show meaningful content on the first paint — no loading spinners, no content flashing in. For a donation page, this matters: every millisecond of delay reduces conversion rates. Our Time to First Meaningful Paint is under 800ms, which puts us in the top tier for fundraising platforms.
The caching strategy for SSR pages balances freshness with performance. Campaign pages that haven't received a donation in the last hour are cached at the CDN level with a 5-minute TTL — short enough to reflect recent donations, long enough to prevent redundant server renders. When a new donation arrives, we invalidate the campaign's CDN cache using Vercel's on-demand revalidation API, ensuring the next visitor sees the updated progress. This hybrid approach gives us SSR's freshness guarantees with near-static-site performance for less active campaigns.
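The TTL decision above can be sketched as a pure function mapping a campaign's last-donation timestamp to a CDN cache header. The one-hour and five-minute thresholds come from the text; the function name and header shape are assumptions:

```typescript
const ONE_HOUR_MS = 60 * 60 * 1000;
const QUIET_TTL_SECONDS = 300; // 5-minute CDN TTL for quiet campaigns

// Campaigns with no donation in the last hour get a 5-minute CDN TTL;
// active campaigns are served fresh and invalidated per donation.
function cacheControlFor(lastDonationAt: Date, now: Date = new Date()): string {
  const quiet = now.getTime() - lastDonationAt.getTime() > ONE_HOUR_MS;
  return quiet
    ? `public, s-maxage=${QUIET_TTL_SECONDS}, stale-while-revalidate=60`
    : 'public, s-maxage=0, must-revalidate';
}
```

The on-demand invalidation side could be handled in the donation webhook with Next.js's revalidatePath from next/cache, though the exact wiring depends on how the donation pipeline is structured.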
The trade-off is infrastructure complexity. SSR requires a running server (we use Vercel), while a static site can be hosted on a CDN for pennies. But for a fundraising platform where discoverability directly impacts donations, the trade-off is obvious. SEO isn't a nice-to-have — it's the primary distribution channel.
We measure SEO impact rigorously using Google Search Console data piped into our analytics dashboard. In the first six months after launching with SSR, organic search traffic accounted for 42% of all campaign page visits — up from essentially zero when we briefly experimented with a CSR-only prototype. The average time to first Google indexing for new campaign pages is 18 hours, and top-performing campaigns reach the first page of results for targeted Korean heritage keywords within a week. These metrics directly correlate with donation volume: campaigns that rank in the top three search results raise 4x more than those below the fold.
Tags: Next.js, SEO, SSR, Web
Key Facts:
- Category: Dev
- Reading time: 10 min
- Technologies: Next.js, SEO, SSR