I am using a simple controller to switch which ReactJS view is shown, like this:
getInitialState: function () {
    return { view: null };
},
setViewFromHash: function () {
    var that = this;
    var address = window.location.hash;
    if (address !== "") {
        address = address.substring(1);
        require(["jsx!" + address], function (View) {
            that.setState({ view: View });
        });
    } else {
        require(["jsx!Home"], function (View) {
            that.setState({ view: View });
        });
    }
},
componentWillMount: function () {
    var that = this;
    window.onhashchange = function () {
        that.setViewFromHash();
    };
    this.setViewFromHash();
},
onTitleUpdate: function (title, canonical) {
    document.title = title + titleDefault;
    $('link[rel=canonical]').prop('href', canonicalDefault + canonical);
},
render: function () {
    var viewToLoad = null;
    if (this.state.view === null) {
        viewToLoad = "Loading...";
    } else {
        viewToLoad = this.state.view({ onTitleUpdate: this.onTitleUpdate });
    }
    return (
        <article>
            {viewToLoad}
        </article>
    );
}
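The hash-to-module mapping inside setViewFromHash can be pulled out into a small pure function, which makes that part of the routing testable outside of React. This is a sketch; moduleFromHash is a hypothetical helper, not part of the original code:

```javascript
// Hypothetical helper extracting the view-resolution logic from setViewFromHash.
// Given the raw window.location.hash, return the RequireJS module id to load.
function moduleFromHash(hash) {
    // "#About" -> "jsx!About"; an empty hash falls back to the Home view.
    var address = hash ? hash.substring(1) : "";
    return address !== "" ? "jsx!" + address : "jsx!Home";
}
```

setViewFromHash would then reduce to a single require call: `require([moduleFromHash(window.location.hash)], ...)`.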
In the view I trigger a callback:
var Home = React.createClass({
    render: function () {
        this.props.onTitleUpdate("Home", "");
        ...
My question is: does this callback occur at a point that would benefit SEO? That is, when the page title and canonical link change, are they updated in a way that lets the Google bot realize the title and canonical have changed?
I'm also considering using Cortex to manage my data. Would that be better, worse, or no different as far as SEO and the Google bot's perception of the "page" it is viewing?
Google generally keeps the details about when they execute JavaScript under wraps; consider this quote from 2010:
"For a while, we were scanning within JavaScript, and we were looking for links. Google has gotten smarter about JavaScript and can execute some JavaScript. I wouldn't say that we execute all JavaScript, so there are some conditions in which we don't execute JavaScript. Certainly there are some common, well-known JavaScript things like Google Analytics, which you wouldn't even want to execute because you wouldn't want to try to generate phantom visits from Googlebot into your Google Analytics".
By and large, I would simply assume that Google (and certainly some other search engines) will not execute my JavaScript, meaning that any changes made to the page after load will not be picked up. Of course, you can simply try it and inspect what Google sees to find out whether it worked.
In either case, React has the ability to render on the server side and transparently hook up its event handlers on the client, giving you the ability to handle both easily. (You may need to re-jigger some code to make it run in both contexts.)
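A minimal sketch of what the server-side half looks like. In a real React app you would call React.renderToString to produce the markup and React.render on the client to attach handlers; here, to keep the sketch self-contained, renderHome and renderPage are hypothetical stand-ins that illustrate the idea, not React APIs:

```javascript
// Hypothetical stand-in for React.renderToString: produce the full markup
// for a view on the server, so crawlers see real content without running JS.
function renderHome() {
    return "<article><h1>Home</h1></article>";
}

// Build the complete HTML document the server would send. The title and
// canonical link are baked into the response, so Googlebot does not need
// to execute any JavaScript to see them.
function renderPage(title, canonical, bodyHtml) {
    return "<!DOCTYPE html><html><head>" +
           "<title>" + title + "</title>" +
           '<link rel="canonical" href="' + canonical + '">' +
           "</head><body>" + bodyHtml + "</body></html>";
}
```

Because the title, canonical, and body content all arrive in the initial HTML, this sidesteps the question of whether the crawler runs your onTitleUpdate callback at all.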