From Lovable to Framer: why rebuilding my website was more of an AI strategy decision than a simple redesign

SPA, SSR, dedicated service pages, schema per page. The structural decisions that determine whether AI systems can find and cite your business.

Category

Web development - GEO

Writer

Alice - Founder at Zest

I'll be honest with you. Rebuilding a website from scratch takes time, energy, and a fair amount of second-guessing. So when people ask me why I moved Zest from Lovable to Framer, the short answer is: AI made me do it.

The longer answer starts with an audit. A few weeks ago I ran a full AI readiness check on my own site, the same audit I now offer clients, and what came back was not flattering. Perplexity was serving cached content that didn't reflect what Zest actually does. ChatGPT didn't cite me at all. Google AIO picked me up on one query out of five. The conclusion was simple: the technical fixes I could apply on Lovable had a ceiling, and that ceiling was the architecture itself.

Lovable could technically handle a multi-page structure. But since I was rebuilding from scratch anyway, I wanted a tool designed for it, with proper SSR out of the box and templates that don't require a design background to look good. Framer checked both boxes.

This article is about what comes after that diagnosis.


Is your website actually readable by AI search engines?

Most websites are not. Not because they're poorly designed, but because they were built for human eyes, not machine readers.

When ChatGPT, Perplexity or Google's AI Overviews look for sources to cite, they send a crawler that reads raw HTML. If your content lives inside JavaScript that only renders after the page loads in a browser, the crawler often leaves empty-handed. It sees the shell, not the substance.
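To make that concrete, here is roughly what a non-rendering crawler receives from a typical JavaScript-built site. This is a generic SPA shell for illustration, not my actual markup:

```html
<!-- The raw HTML a non-rendering crawler sees: an empty shell -->
<!doctype html>
<html>
  <head>
    <title>Zest</title>
  </head>
  <body>
    <div id="root"></div> <!-- all visible content is injected here, later, by JavaScript -->
    <script src="/assets/index.js"></script>
  </body>
</html>
```

Everything a human sees on the page exists only after that script runs. A crawler that never runs it indexes the empty `div`.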

This is the core problem with Single Page Applications. The entire site lives on one URL, one HTML file, and the content is assembled dynamically by JavaScript in the browser. For a human visitor it works perfectly. For an AI crawler it can be close to invisible.

On my Lovable site, that's exactly what was happening. The meta description was wrong, the JSON-LD schema was incomplete, and my name as the founder wasn't readable as plain text. I fixed what I could: corrected the schema, added a FAQPage, deployed llms.txt, standardised the canonical. Perplexity started reading the right description. But the structural ceiling remained. One URL, one page, one entity signal for every AI system trying to understand what Zest does and who runs it.
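For context, llms.txt is a plain-text markdown file served at the site root that gives AI crawlers a short summary of the business and a curated list of key URLs. A minimal sketch of the format (the description and paths below are illustrative, not my actual file):

```markdown
# Zest

> One-line description of what the business does and for whom.

## Pages

- [About](https://example.com/about): who runs the business
- [Services](https://example.com/services): what the business offers
```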

I documented the before and after in a previous article here, if you want the technical detail.


What is the difference between a SPA and an SSR site for AI visibility?

Think of it this way. A Single Page Application is like a restaurant that only shows you the menu after you sit down, order, and wait. A server-side rendered site is one where the menu is already on the table when you walk in.

AI crawlers are impatient customers. If they have to wait for JavaScript to load and assemble your content, many of them leave before the menu arrives. What they index, and potentially cite, is the empty table.

With a properly built multi-page site, each page is a complete HTML file. Your homepage should make clear, in under five seconds, what you do and for whom. Your service pages signal what you offer. Your about page signals who runs the business. Each URL is a distinct, citable entity.

This is not just a crawlability argument. Tools like ChatGPT and Google AI Mode use what is called query fan-out: when someone asks a complex question, the system breaks it down into several sub-queries running simultaneously. One for agencies in Dubai, one for GEO services, one for AI search optimization. If you only have one URL covering everything, you are only catchable on a fraction of those sub-queries. Dedicated pages are not about organisation. They are about surface area.
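The surface-area point can be sketched with a toy script. Everything here is a simplification: the sub-queries, URLs, and page texts are hypothetical, and real fan-out systems do far more than substring matching. The idea is only that dedicated pages give each sub-query a page written in its vocabulary, while a single URL tends to match a fraction of them:

```python
def fan_out(question: str) -> list[str]:
    # A fan-out system breaks one complex question into sub-queries.
    # These three are illustrative, echoing the example in the text.
    return ["agencies in dubai", "geo services", "ai search optimization"]

def coverage(pages: dict[str, str], sub_queries: list[str]) -> dict[str, list[str]]:
    """For each URL, list the sub-queries its text matches (naive substring check)."""
    return {url: [q for q in sub_queries if q in text.lower()]
            for url, text in pages.items()}

sub_queries = fan_out("Who does AI search optimization in Dubai?")

# One URL covering everything: only part of the phrasing survives on the page.
spa_site = {"/": "Zest, one of the marketing agencies in Dubai."}

# Dedicated pages: each sub-query has a page written to answer it.
multi_page_site = {
    "/": "Zest, one of the marketing agencies in Dubai.",
    "/seo-geo-ux-optimization": "GEO services for businesses in the UAE.",
    "/ai-and-automation": "AI search optimization and automation.",
}

print(coverage(spa_site, sub_queries))        # matches 1 of 3 sub-queries
print(coverage(multi_page_site, sub_queries)) # matches all 3, across 3 URLs
```

Three sub-queries, three chances to be cited: the single-page site is catchable on one.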


What does an AI-ready site structure look like in 2026?

It looks like a site built to be read by machines as much as by humans.

Each service has its own dedicated page. Not a section on a homepage that scrolls past, but a proper URL with its own title, meta description, schema markup, and FAQ section. /seo-geo-ux-optimization, /ai-and-automation, and /brand-positioning are my service pages, and each one is a distinct, citable entity. The goal is simple: when someone asks Perplexity who does AI search optimization in Dubai, I want there to be a page that answers exactly that question. Whether that works is what the next few months of data will tell me.

The blog runs on a CMS with published and modified dates on every post. Publication dates matter for Article schema validation, and they signal recency to search crawlers. A post without a date is a source without a reference.
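In schema terms, that means every post carries both timestamps in its Article markup. A minimal sketch with placeholder values:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example post title",
  "author": { "@type": "Person", "name": "Alice" },
  "datePublished": "2026-01-15",
  "dateModified": "2026-02-01"
}
</script>
```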

The /about page exists as a standalone URL where my name, my role, and my background are readable as plain text. Just there, in the HTML, for any crawler that wants to know who runs this business.

Every page has its own JSON-LD schema injected in the head section of the page code. ProfessionalService on the homepage, Article on blog posts, FAQPage on service pages. Not one global block for the entire site.
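For the homepage, that per-page block looks something like this. The fields below are a hedged sketch with placeholder values, not my production schema:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ProfessionalService",
  "name": "Zest",
  "founder": { "@type": "Person", "name": "Alice" },
  "areaServed": "Dubai",
  "url": "https://example.com/"
}
</script>
```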


What do you need on a dedicated service page to appear in AI search results?

Three things: a direct opening, question-based headings, and proper schema markup.

The opening paragraph should answer the most obvious question about that service in two or three sentences. Not a tagline, just a clear declarative statement. AI systems extract the first readable content they find, and if that content is vague or promotional, they move on.

The headings should read like real questions. Not "Our approach" but "What is the difference between GEO and traditional SEO?" These are the exact phrases people type into Perplexity and ChatGPT. The FAQ section follows the same logic: questions in natural language, answers between 40 and 60 words, front-loaded with the actual response.

Each page also needs its own FAQPage schema. A service page about brand positioning and one about AI automation have nothing in common from a crawler's perspective. Treating them as the same entity in your schema is a missed opportunity.
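A per-page FAQPage block might look like this, with one Question entry per FAQ item. The question and answer text here are illustrative:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What is the difference between GEO and traditional SEO?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "A 40-to-60-word answer, front-loaded with the actual response."
    }
  }]
}
</script>
```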


Is migrating your website worth it for AI visibility alone?

Probably not. And I want to be straight about that.

If your current site has a proper multi-page structure and you are just missing schema markup and a few FAQ sections, a migration is overkill. You can get surprisingly far with targeted technical fixes, which is exactly what the AI readiness audit is designed to surface.

But if you are already considering a rebuild for other reasons, the architectural decisions you make have real consequences for AI visibility. Choosing SSR over SPA, building dedicated pages per service, setting up a blog CMS with proper timestamps: none of these cost extra time if planned from the start. They cost a lot of time to retrofit later.

That was my situation. I was rebuilding anyway. The AI readiness layer was not the only reason I migrated, but it shaped every structural decision I made along the way.

What I can say now is that the structure is right. And that is the part you can control before the results come in.


Want to know if your site structure is actually working for AI visibility?

If you are building or rebuilding a website and want to make sure the architecture decisions are right from the start, that is exactly what the AI readiness audit covers. Book a call and we will go through it together.

You can reach me at hello@zest-your-business.com.

And if you want to follow the next step of this experiment, I'll be documenting the actual AI visibility results on the Framer site in a few months. That's also where I'll get into how SEO and GEO work together rather than compete, because architecture is only half the equation.