I am intending to use Angular Universal for server-side rendering (SSR), but this should only be done for crawlers and bots from selected search engines.
What I want is the following scheme (source: https://dingyuliang.me/use-prerender-improve-angularjs-seo/): crawler requests get a pre-rendered page, while normal users get the regular client-side application.
After following the official instructions to set up SSR, I can now verify that Googlebot (finally) "sees" my website and should be able to index it.
However, at the moment all requests are rendered on the server. Is there a way to determine whether an incoming request comes from a search engine and pre-render the site only for those?
This is what I came up with for IIS:
In order to get rid of complex folder structures, change the following line in server.ts
const distFolder = join(process.cwd(), 'dist/<Your Project>/browser');
to this:
const distFolder = process.cwd();
Run the npm run build:ssr command. You will end up with browser and server folders inside the dist folder. Create a folder for hosting in IIS and copy the files from the browser and server folders into it, so that the hosting folder looks like this (a sketch of the copy commands follows the listing):
iis\
-assets\
-favicon.ico
-index.html
-main.js => this is the server file
-main-es2015.[...].js
-polyfills-es2015.[...].js
-runtime-es2015.[...].js
-scripts.[...].js
-...
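For example, assuming the project is named my-app and the hosting folder is C:\inetpub\iis (both names are illustrative, not from the original setup), the build and copy steps could look like this in PowerShell:

# Build the browser and server bundles into dist\my-app\browser and dist\my-app\server
npm run build:ssr

# Copy the contents of both output folders into the IIS hosting folder
Copy-Item .\dist\my-app\browser\* C:\inetpub\iis -Recurse
Copy-Item .\dist\my-app\server\* C:\inetpub\iis -Recurse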
Add a new file to this folder named web.config with this content:
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Angular Routes" stopProcessing="true">
          <match url=".*" />
          <conditions logicalGrouping="MatchAll">
            <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
            <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
            <add input="{HTTP_USER_AGENT}" pattern="(.*[Gg]ooglebot.*)|(.*[Bb]ingbot.*)" negate="true" />
          </conditions>
          <action type="Rewrite" url="/index.html" />
        </rule>
        <rule name="ReverseProxyInboundRule1" stopProcessing="true">
          <match url=".*" />
          <conditions>
            <add input="{HTTP_USER_AGENT}" pattern="(.*[Gg]ooglebot.*)|(.*[Bb]ingbot.*)" />
          </conditions>
          <action type="Rewrite" url="http://localhost:4000/{R:0}" />
        </rule>
      </rules>
    </rewrite>
    <directoryBrowse enabled="false" />
  </system.webServer>
</configuration>
Inside this folder open a Command Prompt or PowerShell and run the following:
> node main.js
Now you should be able to view your server-side rendered website at http://localhost:4000 (if you haven't changed the port).
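To quickly confirm that the Express server returns rendered HTML, you can also request a page from the command line (curl is assumed to be available):

# The response body should contain fully rendered markup,
# not just an empty application root element
curl http://localhost:4000/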
Install the IIS URL Rewrite Module. Note that the reverse-proxy rule above, which rewrites to an absolute URL on another port, also requires the Application Request Routing (ARR) extension with its proxy feature enabled.
IIS will forward requests whose user agent matches googlebot or bingbot to localhost:4000, which is handled by Express and returns server-side rendered content.
You can test this with Google Chrome: open the Developer Tools, select "More tools > Network conditions" from the menu, then in the User agent section disable "Select automatically" and choose Googlebot.
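Alternatively, assuming the site is bound to http://localhost/ in IIS (the URL is illustrative), you can compare both code paths from the command line by spoofing the user agent:

# Does not match the bot pattern: IIS serves the static client-side app
curl http://localhost/

# Matches the bot pattern: IIS forwards the request to Express on port 4000
curl -A "Googlebot" http://localhost/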
You can achieve that with Nginx. In Nginx you can forward the request to the Angular Universal application like this:
if ($http_user_agent ~* "googlebot|yahoo|bingbot") {
    proxy_pass http://127.0.0.1:5000;
    break;
}
root /var/www/html;
This assumes that you are serving Angular Universal at 127.0.0.1:5000 (note that proxy_pass needs the http:// scheme, otherwise nginx rejects the config). If a regular browser user agent comes along, the page is served from root /var/www/html.
So the complete config would be something like this:
server {
    listen 80 default;
    server_name angular.local;

    location / {
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $remote_addr;
        proxy_set_header Host $http_host;

        if ($http_user_agent ~* "googlebot|yahoo|bingbot") {
            proxy_pass http://127.0.0.1:5000;
            break;
        }

        root /var/www/html;
    }
}
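After saving the config, you can verify both paths from the command line (the host name comes from the server_name above; curl is assumed to be available):

# Validate the configuration and reload nginx
sudo nginx -t && sudo nginx -s reload

# Regular browser user agent: served statically from /var/www/html
curl -H "Host: angular.local" http://127.0.0.1/

# Bot user agent: proxied to the Angular Universal server on port 5000
curl -A "bingbot" -H "Host: angular.local" http://127.0.0.1/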