-
Hi,
Currently I am working on a small project, and to get the layout I want, I've implemented this script:
<script type="text/javascript">
var bustcachevar=1
var loadedobjects=""
var rootdomain="http://"+window.location.hostname
var bustcacheparameter=""
function ajaxpage(url, containerid){
var page_request = false
if (window.XMLHttpRequest)
page_request = new XMLHttpRequest()
else if (window.ActiveXObject){
try {
page_request = new ActiveXObject("Msxml2.XMLHTTP")
}
catch (e){
try{
page_request = new ActiveXObject("Microsoft.XMLHTTP")
}
catch (e){}
}
}
else
return false
page_request.onreadystatechange=function(){
loadpage(page_request, containerid)
}
if (bustcachevar)
bustcacheparameter=(url.indexOf("?")!=-1)? "&"+new Date().getTime() : "?"+new Date().getTime()
page_request.open('GET', url+bustcacheparameter, true)
page_request.send(null)
}
function loadpage(page_request, containerid){
if (page_request.readyState == 4 && (page_request.status==200 || window.location.href.indexOf("http")==-1))
document.getElementById(containerid).innerHTML=page_request.responseText
}
function loadobjs(){
if (!document.getElementById)
return
for (var i=0; i<arguments.length; i++){
var file=arguments[i]
var fileref=""
if (loadedobjects.indexOf(file)==-1){
if (file.indexOf(".js")!=-1){
fileref=document.createElement('script')
fileref.setAttribute("type","text/javascript");
fileref.setAttribute("src", file);
}
else if (file.indexOf(".css")!=-1){
fileref=document.createElement("link")
fileref.setAttribute("rel", "stylesheet");
fileref.setAttribute("type", "text/css");
fileref.setAttribute("href", file);
}
}
if (fileref!=""){
document.getElementsByTagName("head").item(0).appendChild(fileref)
loadedobjects+=file+" "
}
}
}
</script>
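For anyone reading along: the cache-busting step in `ajaxpage()` above is just string manipulation, so it can be sketched in isolation as a small helper (the function name `bustCache` and the optional `now` parameter are mine, added here only so the logic can be shown on its own):

```javascript
// Mirrors the cache-busting step in ajaxpage(): appends the current
// timestamp as a throwaway query parameter so the browser never serves
// a stale cached copy of the fetched fragment.
function bustCache(url, now) {
  now = now || new Date().getTime();
  // If the URL already has a query string, chain with "&", else start one with "?"
  return url + (url.indexOf("?") !== -1 ? "&" : "?") + now;
}
```

So `bustCache("about.html")` yields something like `about.html?1234567890`, and a URL that already carries parameters gets the timestamp appended with `&` instead.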
My question: Will this affect SEO adversely? Also, does using this script have any other adverse effects? It basically loads separate internal content into a specified <div> instead of loading the entire page, but I've heard this can hurt in the search engine department. Any advice is greatly appreciated, thank you.
-
JavaScript tags are generally ignored by search engines. What I mean by that is: when a search crawler requests a page that has JavaScript on it, the crawler does not execute the script. So if the page's content relies on JavaScript, then that does affect SEO, but in general, no, it does not.
-
The content itself isn't created by the script, but rather called into the div. For example: I can have something written in the index, but when you click, say, the About link in the navigation, instead of loading about.html entirely, it calls the static content (or database content, however it is set up) from that page into the div. The visitor never actually leaves the basic index page. Since it works that way, never actually going to the page, would the search engine see that content? Or would this be considered JavaScript-created content? Thank you very much for your help!
-
You may have some issues with this setup. From past research, search engines generally don't execute JavaScript. If they don't see any content on the page, they may read the JavaScript to determine whether there are any visible URLs to follow, and then follow them.
Unlike a database-driven setup where variables are carried in the URL, using JavaScript to call or load data can have its drawbacks. There are no absolutes, though.
Just think of a spider as a browser that is only looking for content. Turn off JavaScript, visit the page, and see the results.
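The "turn off JavaScript and look at the page" test can also be approximated programmatically. This is only a rough sketch (a regex is a crude stand-in for a real HTML parser, and `viewWithoutScripts` is a name I made up for illustration): strip out the `<script>` blocks from the raw HTML and whatever remains is roughly the markup a non-JavaScript crawler has to work with.

```javascript
// Crude approximation of what a crawler with no JavaScript support "sees":
// remove every <script>...</script> block and keep the remaining markup.
// (A regex is not a real HTML parser; this is illustration only.)
function viewWithoutScripts(html) {
  return html.replace(/<script[\s\S]*?<\/script>/gi, "");
}
```

Applied to a page built like the one in this thread, the result would be the bare index markup with an empty content div, which is exactly the concern being raised: none of the AJAX-loaded text is in the source a scriptless visitor receives.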