For example, a request for @dwr's profile from TwitterBot will go to https://ssr.warpcast.com/dwr, which server-side renders OG meta tags and some very basic HTML for SEO. All non-bot traffic is just served https://warpcast.com/dwr. This approach adds some maintenance burden, but the benefits are that 1) the main web app isn't burdened with additional SSR complexity, 2) almost all incoming requests only require serving static assets, and 3) there's a clean separation of concerns between OG/SEO logic for robots and the actual web app for humans.
Is there a standard used for identifying bots via user agent? I've seen some lists floating around, but not necessarily a standard.
Did you consider a server-first approach, like Express or Deno Fresh?