
How to change route and content without page refresh? (robot friendly)

I want to know whether it's possible to change the URL shown in the address bar and, according to that, change the content of the page, while keeping the URLs and the content robot friendly (meaning robots can actually index them).

I've already tried using AJAX to load the data dynamically and using AngularJS routing, but neither can be indexed by robots.

Also, pretty URLs and query strings are not what I'm looking for. I'm looking for an approach that renders the data on the initial load and then changes the route and the content on click of links without a page refresh, and I don't want to write the rendering code twice (once server side and once in the front end).

These are the things I've already tried; any help or direction toward a solution would be appreciated.

UPDATE

A library-free solution/structure that would work in any language with no dependencies would be the most accurate answer!

asked Dec 14 '22 by Amin Jafari

2 Answers

Here is something that could represent a starting point for a solution. Before you read on, these are the main things to keep in mind about my answer:

  • all vanilla javascript
  • ajax call to load new content
  • change url on address bar without reloading the page
  • register url changes in browser history
  • seo friendly

But be aware that all of this is draft code meant to explain the solution; you'll need to harden it before using it in production.

Let's start with the index page.

index.php

<!DOCTYPE html>
<html>
<head>
    <title>Sample page</title>
    <meta charset="UTF-8">
    <script type="text/javascript" src="ajax_loader.js"></script>
</head>
<body>

<h1>Some static content</h1>
<a href="?main_content=external_content.php">
    Link to load dynamic content
</a>
<div id="main_content">
    <!--
        Here is where your dynamic content will be loaded.

        You can have as many dynamic containers as you like.

        In this basic example each link targets a single
        container, but you can extend the solution to
        handle multiple containers at the same time.
    -->

    <!-- Leave this empty for the moment... some php will follow -->
</div>
</body>
</html>

Now let's see how the JavaScript can handle the links and load content with AJAX.

ajax_loader.js

window.onload = function() {

    var load = function(e) {
        // exit if the clicked element is not a link
        if (!e.target || e.target.tagName !== 'A') { return; }

        // prevent the browser from following the link
        e.preventDefault();

        // get the href from the clicked element
        var href = e.target.getAttribute("href");

        // retrieve container id and content source
        // (href format: "?container_id=source_file")
        var href_parts = href.split('=');
        var container = href_parts[0].substr(1);
        var source = href_parts[1];

        // instantiate a new request
        var request = new XMLHttpRequest();

        // bind a function to handle request state changes
        request.onreadystatechange = function() {
            if (request.readyState < 4) {
                // request still in progress
                return;
            }
            if (request.status !== 200) {
                // handle error
                return;
            }
            // on success place the response content in the specified container
            document.getElementById(container).innerHTML = request.responseText;

            // change the url in the address bar and save it in the history
            history.pushState('', '', "?" + container + "=" + source);
        };

        // open the request to the specified source and execute it
        request.open('GET', source, true);
        request.send();
    };

    // add a single event listener to the entire document.
    document.addEventListener('click', load, false);
    // the reason why the event listener is attached to the
    // whole document and not only to the <a> elements in the
    // page is that otherwise the links included in the
    // dynamic content would not listen to the click event
    // (event delegation)
};

Now let's take another look at some specific elements of our HTML.

As said before, the proposed script attaches the behavior to any link; you only need to format the href so it can be parsed properly by the load() function. The format is "?container_name=filename.php", where container_name is the id of the div the content should be loaded into, while filename.php is the name of the file called by AJAX to retrieve the content.

So if you have some content in your external_content.php file and want it loaded into the div with id main_content, here is what you do:

<a href="?main_content=external_content.php">Your link</a>
<div id="main_content"></div>

In this example the main_content div is empty when the page first loads and is populated, on click of your link, with the content of the external_content.php file. At the same time the address bar of your browser changes from http://www.example.com/index.php to http://www.example.com/index.php?main_content=external_content.php, and this new URL is registered in your browser history.
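One gap in the draft above: pushState changes the URL, but the browser's back and forward buttons will then change the URL without restoring the content. Here is a minimal sketch of a popstate handler to close that gap, assuming the same "?container=source" URL format (parseQuery is a hypothetical helper name, not part of the code above):

```javascript
// parse a "?container=source" query string into its two parts;
// returns null if the string doesn't match the expected format
function parseQuery(search) {
    var parts = search.split('=');
    if (parts.length !== 2 || parts[0].charAt(0) !== '?') { return null; }
    return { container: parts[0].substr(1), source: parts[1] };
}

// browser-only part: reload the right content when the user
// navigates with the back/forward buttons
if (typeof window !== 'undefined') {
    window.addEventListener('popstate', function() {
        var target = parseQuery(window.location.search);
        if (!target) { return; }
        var request = new XMLHttpRequest();
        request.onreadystatechange = function() {
            if (request.readyState === 4 && request.status === 200) {
                document.getElementById(target.container).innerHTML = request.responseText;
            }
        };
        request.open('GET', target.source, true);
        request.send();
    });
}
```

A fuller implementation would also pass the container/source pair as the state object to pushState instead of re-parsing location.search.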

Now let's go further and see how we can make this SEO friendly, so that http://www.example.com/index.php?main_content=external_content.php is a real address and the main_content div is not empty when the page loads.

We can just add some PHP code to handle this. (Please note that you could even write some JavaScript to do a similar job, but since you mentioned the use of a server-side language I decided to go for PHP.)

<a href="?main_content=external_content.php">Load</a>
<div id="main_content">
    <?php dynamicLoad('main_content','default_main_content.php'); ?>
</div>

Before showing it, I want to explain what the PHP function dynamicLoad() does. It takes two parameters: the first is the container id, the second is the file containing the default content. To be clear: if the requested URL is http://www.example.com/ the function puts the content of default_main_content.php in the main_content div, but if the URL requested by the browser is http://www.example.com/index.php?main_content=external_content.php then the function puts the content of external_content.php in the main_content div.

This mechanism keeps the page SEO friendly and user friendly: when a search-engine crawler follows the href "?main_content=external_content.php", which leads to the URL "http://www.example.com/index.php?main_content=external_content.php", it finds the same content that is displayed dynamically by the AJAX call. And the same holds for a user who reloads the page with a refresh or from the history.

Here is a simple dynamicLoad() PHP function. Note that including a file named by a query-string parameter is a local file inclusion vulnerability, so the requested source must be checked against a whitelist of known files:

<?php
    function dynamicLoad($container, $defaultSource) {
        // never include arbitrary user input:
        // only allow known content files
        $whitelist = array('default_main_content.php', 'external_content.php');
        $loadSource = $defaultSource;
        if (isset($_GET[$container]) && in_array($_GET[$container], $whitelist, true)) {
            $loadSource = $_GET[$container];
        }
        include($loadSource);
    }
?>

As said in the first lines, this is not code ready for production; it's just an explanation of a possible solution to the request you made:

to change the url showing and according to that change the content of the page and make the urls and the content of the page to be robot friendly

answered Dec 21 '22 by Igor S Om

If you really care about your SEO you should not use AJAX to fill your site dynamically. This is not so much about Google's spiders, since they can read JavaScript to some extent, but about other search engines' spiders.

The best and oldest approach is to use normal routes, but you can simulate them with Node.js and React so that you can still use JavaScript to render your content. This is called isomorphic JavaScript, if I have it correctly.

http://nerds.airbnb.com/isomorphic-javascript-future-web-apps/

Update:

An application that can only run in the client-side cannot serve HTML to crawlers, so it will have poor SEO by default. Web crawlers function by making a request to a web server and interpreting the result; but if the server returns a blank page, it's not of much value. There are workarounds, but not without jumping through some hoops. (source: the Airbnb engineering blog)

The difference is that the user experiences the speed of client-side rendering while web crawlers get their content from the server.
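The isomorphic idea boils down to a single render function shared by server and client. A minimal sketch under assumed names (renderPage and the pages map are hypothetical, not part of any framework):

```javascript
// one template function, usable both in Node (so the first
// request gives crawlers full HTML) and in the browser (for
// subsequent route changes without a refresh)
var pages = {
    '/': '<h1>Home</h1>',
    '/about': '<h1>About us</h1>'
};

function renderPage(route) {
    var body = pages[route] || '<h1>Not found</h1>';
    return '<div id="main_content">' + body + '</div>';
}

// on the server (Node), something like:
//   http.createServer(function(req, res) {
//       res.end('<!DOCTYPE html><html><body>' + renderPage(req.url) + '</body></html>');
//   }).listen(8080);
//
// in the browser, the same function fills the container on navigation:
//   document.getElementById('main_content').outerHTML = renderPage(route);
```

Frameworks like Next.js industrialize exactly this pattern, but the core is just sharing the render function between the two environments.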

answered Dec 21 '22 by J. Overmars