Saturday, February 09, 2008

Free light antivirus applications for personal use

Most people complain about antivirus software slowing down their computers. They look for a fast, reliable antivirus program that won't miss new virus threats and that comes with an integrated shield protecting their system from further infection.
This time I'll present you three free applications: the Spyware Terminator anti-spyware, the Avira antivirus and 360 Total Security.
For more antivirus comparisons, you can take a look at this Windows 10 course.


Spyware Terminator

http://www.spywareterminator.com
After playing with it, I'll share the features of Spyware Terminator that I've found useful:
It has a light and almost intuitive, easy-to-use interface. Virus scanning is fast, and the virus definitions are updated daily with support for peer-to-peer downloads. The program's executable has a very small memory footprint: compared with Norton Antivirus, BitDefender, Avast, Kaspersky (KAV) and NOD32, it takes the smallest amount of memory.
The integrated real-time antivirus shield works nicely and actually stops virus infections. The program catches objects (spyware, adware, trojans and viruses) better than both SpyBot and AdAware.
Once finished with a full spyware and virus scan, Spyware Terminator enables its unique HIPS protection. HIPS is short for Host Intrusion Prevention System - a system that maps the local files on your hard drive. Then, if an unknown virus is found, the antivirus can restore the infected file without damaging it. There's no need to worry: the mapping will not slow down your computer or take up much hard disk space.
Another nice feature is that the program easily removes resident (remaining in the computer's memory) viruses and trojans, even ones spawned from processes such as explorer.exe, without requiring a system restart.
After each cleaning process, Spyware Terminator creates a clean Windows restore point. This allows you to roll your system back to a time when the computer was in good shape.
In case you find an unknown adware, virus or trojan, you can send the infected file to the Spyware Terminator labs for examination - either from the application itself or via their website forum. In a few weeks, updated antivirus definitions will be released, fixing your problem.
Playing games, installing software
You can switch off the Internet Shield mode as well as the Internet guard to receive fewer notification pop-ups from the application when installing new, trusted software.

AVIRA
For more serious protection, I would also recommend the free version of Avira AntiVir, which has a resident shield, occupies a small amount of system memory and constantly checks its database server for new updates. A nice touch is that during installation Avira will warn you if other installed/incompatible antivirus products are found on your computer.

http://www.avira.com/en/free-antivirus-windows

Last but not least, I'd like to present the most powerful one, called 360 Total Security.
It can be downloaded for free here: http://www.360totalsecurity.com/

Screenshot: http://static.ts.360.com/home/images/home/screenshot.en-6b734003.jpg

The antivirus has an exceptionally good virus database, plus lots of free additional tools for speeding up the computer, cleaning up the registry and removing old Windows backup files. Not to mention its useful resident shield.

And please don't forget
After downloading and installing any antivirus software, update the program's virus definitions - usually from the menu accessible by right-clicking the application's taskbar icon.

Cheers!

Sunday, April 15, 2007

Deoptimising - a new way of SEO

I'll start this post with the assumption that you may have experienced problems with Google's ranking algorithms.
Let's say you are writing new content, and for certain keywords it appears not on the first page but on the 3rd page or even at the end of the search results. A possible reason might be that your website has lost its trust rank and has had the so-called 950 penalty applied at runtime by Google. Here is what you can do to restore your rankings. And in the meantime, you can learn a bit more about the SEO topic from my online course.

Check whether you need to deoptimize:
If you want to see all the non-supplemental pages from your site, just type in Google:
site:www.yourwebsite.com/*

Then to see just the supplemental pages from your site type:
site:www.yourwebsite.com/ -site:www.yourwebsite.com/*

Keep the ratio below 50%

Also try the automated supplemental ratio tool: http://www.mapelli.info/tools/supplemental-index-ratio-calculator

What to do next:
1. Validate your website

2. If you use plenty of H1, H2, H3 tags remove most of them or replace 'H1' with stylized 'H2' or 'span' and 'strong' tags.

3. Don't use the same data in your 'title', 'h1' and 'meta description' tags.

4. Your website's inner-linking navigation uses the same linking structure on every page. Using the same 'title' attributes in the menu on every page of your site is considered spam.

5. Your affiliate/referral links should differ in their anchor text. Check that the 'title' attributes are unique and avoid keyword stuffing there.
Pay special attention to 'title' and 'alt' attributes: if they are overstuffed, the Google bot will just place the first few lines of your page as the description in its search results, which turns out to be your repeating website heading information.
Solution: examine what your search results look like (site:http://www.yourwebsite.com), see exactly what Google indexes and make the corresponding changes, i.e. reduce the 'title' attributes.

6. Remove any static content from the bottom of your website's pages, especially the outbound links, etc.

7. Check whether your affiliate links are thematic or not. Remove those that are not connected to your site's theme or add rel="nofollow" to them. If your site displays RSS feeds, be sure to add rel="nofollow" there as well (see the sketch after this list).
Update: Try this tool to find out whether you are linking to bad neighborhood websites: http://www.bad-neighborhood.com/text-link-tool.htm

8. Lower your content keyword density
http://www.webconfs.com/keyword-density-checker.php
Keyword density is an important factor if you're serious about SEO. Once you have great content, the crucial part is being able to present it to the right public.
Having a keyword density above, for example, the 2-4% threshold will get your content flagged and it won't compete/show with other websites in the SERPs.

- check your navigation (posts archive) - too many links to posts with keywords in their titles increase the overall keyword density of the post, so be careful.
- when using forms, also look over your hidden field values: do not use keywords there - it's an easily overlooked issue.

- also, don't forget to check your content for being detected as potential spam:
http://tool.motoricerca.info/spam-detector/

9. Limit the usage of 'title' attributes in the <a href> tags as well as <b>/<strong> tags -> they add weight to the content presented in the SERPs.
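To illustrate point 7, here is a minimal PHP sketch (the function name and the sample markup are mine, not from a specific CMS) that adds rel="nofollow" to every anchor in a fragment of fetched feed or affiliate HTML before printing it:

<?php
// add_nofollow: mark every link in an HTML fragment as rel="nofollow"
function add_nofollow($html)
{
    $doc = new DOMDocument();
    // @ suppresses warnings caused by imperfect real-world markup
    @$doc->loadHTML($html);
    foreach ($doc->getElementsByTagName('a') as $link) {
        $link->setAttribute('rel', 'nofollow');
    }
    // note: saveHTML() wraps the fragment in a full document skeleton;
    // trim the extra tags if you echo this inside an existing page
    return $doc->saveHTML();
}

echo add_nofollow('<p>Read <a href="http://example.com/">this review</a></p>');
?>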

These might sound like drastic measures, but I've already managed to get 3 websites out of the penalty using the above techniques. So experiment and see what happens. Wait, and hopefully soon you'll be out of Google's supplemental index too.

Load your navigation/advertising sections using an AJAX request, not purely via JavaScript.

Check Google's Webmaster Tools and fix any potential duplicate issues.

Check all of your sub-domains for supplemental results and fix them as soon as possible.
You know the benefits of organic SEO's long-lasting effect versus link-driven short-term success. Here are some simple steps to ensure your website a long-term flow of quality traffic from happy visitors.

My websites - full of unique content and constantly expanding - had a problem: plenty of pages gradually went into the supplemental index (i.e. only 50 out of 500 results were in the main index). After lots of experiments and reading, below are my guidelines on how to do on-page optimization organically, or how to easily get more of your content indexed:

Paginated results
Ensure a unique meta description wherever you can on your website, even on paginated results: if you have an article with lots of comments, then on the 2nd and onward comment pages strip the article text, leaving just the comments. This way you'll create a brand-new unique content page, just like in forums.
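As a rough sketch of the idea in PHP (the $page parameter and the placeholder content are hypothetical, since every CMS wires this differently):

<?php
// comments page - unique meta description per page; article body only on page 1
$page = isset($_GET['page']) ? max(1, (int)$_GET['page']) : 1;

$article_text  = '<p>The article body...</p>';                // placeholder
$comments_html = '<p>Comments for page ' . $page . '...</p>'; // placeholder

echo '<meta name="description" content="Comments on My Article, page ' . $page . '" />';

if ($page === 1) {
    echo $article_text;   // strip the article text on the 2nd page onward
}
echo $comments_html;
?>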

Repetition and bold text
Pay special attention to em (italic) and bold tags - they add weight to the web page, and if repeated across pages they could trigger Google's penalty filter. Remove repeated word occurrences such as: Reply to this comment, Vote, etc. - replace such text using unobtrusive JavaScript.

Unique heading and meta descriptions
Look especially at headings like H1 and H2, and make sure they are unique and not repeating.

Loading time
Improve loading time: inspect page loading time with Yahoo's YSlow and Google's PageSpeed browser addons and try to make most of the suggested improvements. You can also press F12 to open the Developer Tools in Chrome or Firefox and inspect your content from the Network tab to detect slow-loading elements:


Let's recap the main speed improvements (a minimal PHP sketch follows the list):
- make a cached version of your pages
- use asynchronous Google Analytics and social sharing buttons such as Facebook, Twitter, etc.
- place all your JavaScript at the bottom of your page - this way the page content will load first.
- gzip your CSS and JS files
- beware of WordPress's wp-cron.php file - it hogs the system's CPU resources and might get you banned by your hosting provider: just rename it, or find where it is used and disable all calls (includes) to this file.
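Here is a minimal PHP sketch of the first and fourth points (the cache file path and lifetime are arbitrary choices of mine): output buffering saves a cached copy of the page, and ob_gzhandler gzips whatever is sent to the browser:

<?php
// serve a gzipped cached copy if it's fresh enough, otherwise rebuild it
$cache = '/tmp/page-cache.html';   // hypothetical cache location
$ttl   = 3600;                     // keep the cache for one hour

if (is_file($cache) && time() - filemtime($cache) < $ttl) {
    ob_start('ob_gzhandler');      // gzip the response
    readfile($cache);
    exit;
}

ob_start('ob_gzhandler');
echo '<html><body>Expensive page content...</body></html>';
file_put_contents($cache, ob_get_contents()); // store the uncompressed copy
ob_end_flush();
?>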

Blogger users
If you use Blogger's hosting, use this sitemap tool and provide the generated sitemap in Google's Search Console. The benefit is that this way you can submit all your posts for indexing.

Canonical URLs
Check whether your website is listed in the SERPs via http://yourwebsite.com, http://www.yourwebsite.com or http://yourwebsite.com/index.php

If that's the case you'll have to:
1. Manually rewrite all your internal links to the already indexed/preferred (www or non-www) version.
2. Permanently redirect, using .htaccess mod_rewrite, your http://yourwebsite.com/index.html or http://yourwebsite.com/index.php page to the root domain of your indexed website URL (i.e. http://www.yourwebsite.com/).
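If you can't edit .htaccess, a rough PHP fallback at the very top of index.php might look like this (the www URL here is a placeholder for your preferred version):

<?php
// 301-redirect direct /index.php requests to the preferred root URL
// so that only one version of the homepage gets indexed
if (strtok($_SERVER['REQUEST_URI'], '?') === '/index.php') {
    header('Location: http://www.yourwebsite.com/', true, 301);
    exit;
}
?>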

Link weight

Use the Supplemental Results Detector http://www.seo4fun.com/php/pagerankbot.php to distribute link weight evenly between your pages.

Source Ordered Content
Display your content first to search engines via CSS - especially important for the new Panda update.

Log your results
Write down in a text file the date on which you make changes, then check your statistics the following week to determine whether the changes are beneficial.

Sunday, March 04, 2007

Acne - clean your liver

Acne - liver connection
During puberty, toxin levels in the body are higher because of increased metabolism and rapid growth. At that time the liver (as the main filtering organ) can easily become overwhelmed. When that happens, the body triggers its natural response of flushing the toxins out through the skin, resulting in an acne breakout. One way to restore the liver's normal functioning is via a cleanse.

Daily cleanse recipe
If you want to purify and detoxify your body from the inside, here is a recipe that you can follow daily until you notice improvement. It's nothing more than a salad dressing, so it's harmless unless you've got some serious liver issues.

Put the following ingredients into a glass:

1 glass of water
1 tablespoon of cold pressed virgin olive oil
juice of one lemon
2 garlic cloves
pinch of ginger

Stir or blend the mixture and drink it on an empty stomach. Also, if you've acidified your body's pH due to stress, negative emotions or an improper diet, the drink will balance it back to normal (due to the lemon's alkaline properties). Proper alkalization of the whole body's pH takes around 30 days, combined with green juices like spirulina.

Fibre
Also, take additional fibre to introduce more good bacteria into your stomach. Fibre keeps toxins and cholesterol from being reabsorbed into the blood and helps them be excreted normally. A good source are the probiotic cultures of Lactobacillus bulgaricus and Lactobacillus bifidus found in yoghurt.

Don't forget to drink 8 glasses of water daily. On the 5th-6th day you'll notice the difference. The recipe also has a beneficial effect on back acne.


Additionally
Here is a list of herbs that might relieve symptoms caused by improper liver functioning. You can prepare a herbal tea or an extract from them. And please consult a qualified physician before consumption.

Dandelion root & leaves, Burdock root, Senna Alexandrina, Berberis Vulgaris or European Barberry, Cascara Sagrada, Cayenne Pepper, Fennel, Aloe Vera, Rhubarb, Marjoram, Cat's Claw, Comfrey(Symphytum), Chicory, Goldenrod(Solidago), Hortensis Root, Marshmallow Root, Bear's Grape, Thistle, Hydrastis, Liquorice, Glycyrrhiza glabra...

Monday, February 05, 2007

Clear acne with honey

Acne is a skin problem that usually begins in both sexes during puberty. One of its causes is the hormonal change that occurs at this age. It is often observed as black spots in the skin pores of the face, neck and back.
Often, followed by an additional infection, acne leads to the formation of pus pimples.
Acne is stressful for kids, as in this period of rapid growth psychological changes related to anger and a hot temper are usually observed.

Careful when treating!
When treated improperly, acne can lead to serious complications, so it is forbidden to squeeze the pimples, especially those residing on the face. Young people must know that such actions could affect the body's central nervous system and lead to meningitis.
The underlying reason is that the facial skin veins drain into brain vessels called sinuses. So when physically squeezing the pimples, the pus can easily pass from the veins into the sinuses and then into the structural brain veins.
Incorrect skin treatment also hides another danger - an infection called sepsis, which affects the whole organism. Both meningitis and sepsis are life-threatening, but it is important to know that when treated properly, the Vulgaris and Juvenilis forms of acne can easily be managed without such complications.

What honey products give good results when treating acne?
Masks with pure nectar bee honey
It is recommended to buy honey directly from the producer and not from the store. Also, do not try to liquefy an already crystallized one, because this might diminish its healing properties. Just know that on skin contact, at a temperature of about 36°C, honey's thin layers will instantly liquefy and provide all of their benefits.

The procedure
Here are three honey procedures that will make your skin clear and beautiful!
  • Wash your skin with water and mild soap. It is better if the soap is honey-enriched. Dry carefully, without any irritation, and apply a thin layer of honey. Wait 1 hour and gently wash it off with warm water. Do 2 such procedures daily for 15 days.

  • Pimples can also be treated for a 15-day period with a 30% spirit solution of propolis.

  • Boys: 3 times daily, take 1 teaspoon of honey mixed with honey bloom (1kg:200gr.), 1 hour before lunch

    Girls: 3 times daily, take 1 teaspoon of honey mixed with honey milk (1kg:10gr.), 1 hour before lunch

Types of honey
Here is a list of popular honey types and the reasons why they are so beneficial:
  • Lime tree honey offers first-class quality - it has a light yellow color, crystallizes easily and has a specific aroma. It is used for inhalation when fighting respiratory problems. It also has a positive effect on the intestines and kidneys.

  • Acacia honey is considered one of the best honeys - transparent, light, more liquid and with the specific acacia scent. It's suggested for heart and intestinal problems and for inflammations occurring in women.

  • Polyfloral field honey has a light-amber color with a tender aroma and taste. Another one in the family famous for its healing qualities.

  • Monofloral meadow honey - subdivided into transparent and semi-transparent - is used effectively in gynecology and diets.

  • The sunflower type has a saturated, almost brown color with a unique aroma and bitterness.

  • The tea-colored chestnut type comes with a tart taste. This honey has a stimulating effect and is also known as the man's honey because of its aphrodisiac qualities.
Did you know?
Inverted sugar: in fact, this is the sum of glucose and fructose, responsible for honey's quality. The best quality honey is a mixed-variety one with an inverted sugar content of up to 80%.
Saccharose: the plain sugar obtained from sugar beet and sugar cane. Honey with an increased quantity of saccharose is a low-value one. High levels of saccharose can be observed when the bees are being fed sugar syrup, as well as when their saliva glands responsible for producing sugar from nectar are exhausted.

Friday, January 12, 2007

Duplicate website content and how to fix it

Here I'll present a practical way to avoid the duplicate content penalty.

When is this penalty applied?
This kind of penalty is applied by search engines such as Google when there is an indication of two identical versions of your site's content.

How can your website become a victim of such a penalty?
Modern content management systems (CMS) and community forums offer numerous possibilities for managing new content, but because of their deep structure their URLs are very long, and search engines are unable to fully spider the site.
The solution for webmasters was to rewrite the old URL, so the index.php?mode=look_article&article_id=12 URL now becomes just article-12.html. As a first step this serves its purpose, but if left like this, both URLs are going to be indexed. Through the eyes of a search engine we'll see the same content in 2 instances, and of course the duplicate filter is triggered:
1st instance: index.php?mode=look_article&article_id=12

2nd instance: article-12.html
Easy solution
The solution is done via the PHP language and using .htaccess Apache file.
First off, we'll rewrite our URLs so they are search-friendly. Let's assume we have to map index.php?mode=look_article&article_id=... to article-....html

Create an empty .htaccess file and place the following in it. First, edit the code and fill in your website address. If you don't have a subdomain, erase the subdomain part as well.
RewriteEngine on

RewriteRule article-([0-9]+)\.html    http://www.yourwebsite/subdomain/index.php?mode=look_article&article_id=$1&rw=on

RewriteCond %{the_request} ^[A-Z]{3,9}\ /subdomain/index\.php\ HTTP/
RewriteRule index\.php http://www.yourwebsite/subdomain/ [R=301,L]

RewriteCond %{HTTP_HOST} .
RewriteCond %{HTTP_HOST} ^www\.yourwebsite\.subdomain [nc]
RewriteRule ^(.*)$ http://yourwebsite/subdomain/$1 [R=301,L]

Explanation:
  • RewriteRule article-([0-9]+)\.html http://www.yourwebsite/subdomain/index.php?mode=look_article&article_id=$1&rw=on
    Those lines allow article-12.html to be loaded internally as index.php?mode=look_article&article_id=12
    The variable &rw=on is important for the PHP code later on, so don't forget to include it.
  • RewriteCond %{the_request} ^[A-Z]{3,9}\ /subdomain/index\.php\ HTTP/
    RewriteRule index\.php http://www.yourwebsite/subdomain/ [R=301,L]
    These lines prevent index.php from being treated as a separate page (which would lower your website's PR) and transfer all the PR from index.php to your domain.
  • RewriteCond %{HTTP_HOST} .
    RewriteCond %{HTTP_HOST} ^www\.yourwebsite\.subdomain [nc]
    RewriteRule ^(.*)$ http://yourwebsite/subdomain/$1 [R=301,L]
    This avoids duplicate URLs such as the www and non-www versions and transfers all the requests and PR to the non-www site.

Then create a file header.php and include it in your website before all other files:

Put there:

<?php
$rw = isset($_GET['rw']) ? $_GET['rw'] : '';
if ($rw == "on") {
    echo "<meta content=\"index,follow\" name=\"robots\" />";
} else {
    echo "<meta content=\"noindex,nofollow\" name=\"robots\" />";
}
?>

This tells the search engines to index only the pages that have the rw flag set to on - that is, the article-12.html style pages set up earlier.
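For clarity, a hypothetical article template could pull header.php in like this, so the robots meta tag lands inside <head>:

<html>
<head>
<?php include 'header.php'; // prints index,follow only when ?rw=on is present ?>
<title>Article</title>
</head>
<body>...article body...</body>
</html>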

Of course, if you have access to the robots.txt file in your root domain, then you can just disallow the look_article script there and you are done:
User-agent: *

Disallow: /look_article.php



Notes: for those using a CMS - check whether your pages are still accessible using different parameters in the URL.
Example: you've deleted an article with id=17, but the empty template is still accessible and produces a 200 OK header status - this will surely be recognized as thin content by Google.
Solution:
1. Find those empty pages and give them a 404 Not Found header status (a combined sketch follows below):

header("Status: 404 Not Found");


2. Create an error404.html file explaining that the user is trying to access a non-existent page.

3. Then add the custom 404 error page to your .htaccess file:
ErrorDocument 404 /your_domain_name/error404.html

This way the search engine spider won't penalize your template for displaying empty information - it will now see those pages as 404 not-found documents.
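Putting the three steps together, a minimal sketch could look like this (fetch_article and its data are made up for illustration):

<?php
// serve a real 404 instead of an empty template for deleted articles
function fetch_article($id)   // hypothetical lookup; null when deleted
{
    $articles = array(12 => '<p>Article twelve...</p>');
    return isset($articles[$id]) ? $articles[$id] : null;
}

$id = isset($_GET['article_id']) ? (int)$_GET['article_id'] : 0;
$article = fetch_article($id);

if ($article === null) {
    header("HTTP/1.1 404 Not Found");  // step 1: proper status code
    include 'error404.html';           // step 2: the custom error page
    exit;                              // never render the empty template
}
echo $article;
?>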

The next step involves cleaning up an already indexed but duplicated website content in order to regain the search engine's trust.

Above is a sample screenshot taken from the Google Search Console Keywords report. You may ask why we need it when we can use Firefox's integrated Show keyword density function.
Well, one benefit is that this report shows the significance of specific keywords across your pages. Let me explain what this means:
Suppose you are optimizing content for the keyword 'cars'. It's normal practice to repeat 'cars' 2-3 times, style it in bold, etc. Everything's fine as long as you do it naturally. The moment you overstuff your page with this keyword, it will get penalized and lose its current ranking in the Google SERPs. So you have to be careful with such repetitions.

Moreover, in the report you can see the overall website keyword significance. And because Google likes thematic websites, it is really important for these keywords to reflect your website's purpose or theme. Otherwise you're just targeting the wrong visitors and shouldn't be puzzled by the high abandonment rate.

But enough with the theory - now let's discuss how you can fix things up:

Check every individual webpage's keyword density online via Webconfs and correct (reduce) words that are used over 2%. Again, this percentage depends on the competition for your keywords, so tolerable levels can vary up and down.
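If you'd rather check locally than through the online tool, a rough PHP density calculator might look like this (the sample text is just for the demo):

<?php
// keyword_density: percentage of total words taken up by one keyword
function keyword_density($text, $keyword)
{
    $words = str_word_count(strtolower(strip_tags($text)), 1);
    if (count($words) === 0) return 0.0;
    $hits = count(array_keys($words, strtolower($keyword)));
    return 100 * $hits / count($words);
}

$html = '<p>Cars, cars and more cars: a page about cars.</p>';
printf("density: %.1f%%\n", keyword_density($html, 'cars'));
?>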

Add the 'canonical' tag to all your website pages:
<link rel="canonical" href="http://www.example.com/your_preferred_webpage_url.html" />
(and make sure to specify the URL that you really prefer!). This tells the search engine which webpage is the legitimate one. More info: http://googlewebmastercentral.blogspot.com/2009/02/specify-your-canonical.html



Blogger users can add the canonical tag with the following code in the head section of the Template:
<b:if cond='data:blog.pageType == "item"'>
<link expr:href='data:blog.url' rel='canonical'/>
</b:if>
(it will remove the parameters appended at the end of the URL, such as http://nevyan.blogspot.com/2016/12/test.html?showComment=1242753180000,
and specify the original authority page: http://nevyan.blogspot.com/2016/12/test.html)

Next, to prevent duplicate references to your archive (i.e. .../2009_01_01_archive.html) and label pages (i.e. /search/label/...) from getting indexed, just add:
<b:if cond='data:blog.pageType == "archive"'>
<meta content='noindex,follow' name='robots'/>
</b:if>
<b:if cond='data:blog.pageType == "index"'>
<b:if cond='data:blog.url != data:blog.homepageUrl'>
<meta content='noindex,follow' name='robots'/>
</b:if>
</b:if>

To prevent indexing of the mobile duplicates of the original pages:
    <b:if cond="data:blog.isMobile">
<meta content='noindex,nofollow' name='robots'/>
</b:if>

And a working solution blocking even the /search/tags pages from indexing, allowing only the homepage and posts to be indexed:
    <b:if cond='data:blog.pageType == "index" and data:blog.url != data:blog.homepageUrl'>
<meta content='noindex,follow' name='robots'/>
</b:if>

Thursday, December 28, 2006

How to recover deleted files on a formatted drive?

Here I'll show you how to use File Scavenger, a program that allows you to recover permanently lost files from your hard drive. The application is small but very effective at restoring deleted information. It scans your hard drive and detects files that can still be restored.

Recently I had to restore a 60GB HDD that had been formatted twice with NTFS (a quick format and a full format during Windows installation), after which two versions of Windows XP were installed on it.
I've tried lots of recovery software on this drive to see what results they could produce, and I think this program deserves special attention.
As an end result, I managed to rescue 35GB of music and photos from the hard drive.




Usage:

First off, ensure that you've got a spare hard disk with a size similar to the damaged/deleted/formatted one.
Example: if you want to restore 40GB of information, prepare another drive with 40GB of free space.
Get a license, because otherwise you can recover only the starting chunks of the deleted files.

Start File Scavenger. For the Search mode select Long Search and press the Search button. On the warning message press: No. Display deleted files. Then you'll have to wait some time until the program finishes scanning.
In the Recover input box, use the Browse... button to select the free hard drive already prepared for storing the information. You'll have to wait for this process to finish. When it's ready you'll have your information back, and all you'll have to do is sort it, because the files will have been renamed.

Happy and successful restoring, and be sure to check my Windows 10 course. Cheers!

Thursday, December 21, 2006

Free website click heatmap - DIY

What are click heatmaps?
Heatmaps show the places on your website where users mostly click or hover their mouse. They are also known as website hot spots.
If you want to learn more about the JavaScript techniques used here, you can visit the JavaScript course.

 
Why do you need to know your website hotspots?

Once you know their location, you can reorder your important information, optimize your adverts to increase their CTR (click-through rate), etc.

Remember that this type of click tracking is different from simple web counters. Click heatmaps record the exact x,y click and mouse-hover positions. That makes such statistics really valuable for those looking for usability improvements.

Of course, there are free services such as Crazyegg.com and Clickdensity.com, but the JavaScript they offer actually slows down the user experience. So the following is a guide on how to build your own heatmap tracking. I've combined 2 techniques so that the AJAX script can work across multiple domains. Here are their respective URLs:


for the heatmap:
http://blog.corunet.com/english/the-definitive-heatmap

for the cross-domain AJAX:
http://blog.cbciweb.com/articles/tag/asynchronousjavascriptacrossdomains

INSTRUCTIONS:
1. Create an empty, readable and writable file clickcount.txt on your webserver.
2. Place the following JavaScript code in your website, just above the ending </body> tag:

var xOffset,yOffset;
var tempX = 0;
var tempY = 0;
 
 
//detect browser
var IE = document.all?true:false

//find the position of the first item on screen and store offsets
//find the first item on screen (after body)
var firstElement=document.getElementsByTagName('body')[0].childNodes[1];
//find the offset coordinates
xOffset=findPosX(firstElement);
yOffset=findPosY(firstElement);
if (IE){ // In IE there's a default margin in the page body. If margin's not defined, use defaults
var marginLeftExplorer  = parseInt(document.getElementsByTagName('body')[0].style.marginLeft);
var marginTopExplorer   = parseInt(document.getElementsByTagName('body')[0].style.marginTop);
/*assume default 10px/15px margin in explorer*/
if (isNaN(marginLeftExplorer)) {marginLeftExplorer=10;}
if (isNaN(marginTopExplorer)) {marginTopExplorer=15;}
xOffset=xOffset+marginLeftExplorer;
yOffset=yOffset+marginTopExplorer;
}
/*attach a handler to the onmousedown event that calls a function to store the values*/
document.onmousedown = getMouseXY; 

/*Functions*/
/*Find positions*/
function findPosX(obj){
var curleft = 0;
if (obj.offsetParent){
while (obj.offsetParent){
curleft += obj.offsetLeft
obj = obj.offsetParent;
}
}else if (obj.x){
curleft += obj.x;
}
return curleft;
}
 
 
function findPosY(obj){
var curtop = 0;
if (obj.offsetParent){
while (obj.offsetParent){
curtop += obj.offsetTop
obj = obj.offsetParent;
}
}else if (obj.y){
curtop += obj.y;
}
return curtop;
}
function getMouseXY(e) {
if (IE) {
tempX = event.clientX + document.body.scrollLeft
tempY = event.clientY + document.body.scrollTop
} else {
tempX = e.pageX
tempY = e.pageY
}
tempX-=xOffset;
tempY-=yOffset;

var url='http://yourwebsite.com/scripts/empty.php' /* Type your website URL here*/
url = url + '?x='+tempX+'&y='+tempY;
ajad_send(url);
 
return true;
}
 
var ajad_ndx_script = 0;
 
function ajad_do (u) {
// Create new JS element
var js = document.createElement('SCRIPT');
js.type = 'text/javascript';
ajad_ndx_script++;
js.id = 'ajad-' + ajad_ndx_script;
js.src = u;
 
 
// Append JS element (therefore executing the 'AJAX' call)
document.body.appendChild(js);
return true;
}
 
function ajad_get (r) {
// Create URL
// Do AJAD
return ajad_do(r);
}
  
function ajad_send(url) {
// referrer
// send it
ajad_get(url);
 
// remove the last script node.
document.body.removeChild(document.getElementById('ajad-' + ajad_ndx_script));
ajad_ndx_script--;
}



3. Create the file empty.php and fill it with:
<?php
// append the request URI (it carries the x,y click coordinates) to clickcount.txt
$q = $_SERVER['REQUEST_URI'];
file_put_contents("clickcount.txt", $q . "\n", FILE_APPEND | LOCK_EX);
?>


4. Statistics
Allow at least 2 days for clicks to accumulate. If you open clickcount.txt in your browser, you'll see the x,y coordinates of the user clicks.
To visualize the gathered data, create a file click_count.php with the following contents:

<?php
header("Content-Type: image/png");
$width  = 1024;
$height = 4300;

$im    = imagecreate($width, $height);
$red   = imagecolorallocate($im, 255, 0, 0);
$white = imagecolorallocate($im, 255, 255, 255);
$black = imagecolorallocate($im, 0, 0, 0);
$blue  = imagecolorallocate($im, 0, 0, 255);
$gray  = imagecolorallocate($im, 0xC0, 0xC0, 0xC0);
imagefill($im, 0, 0, $black);

$file = "clickcount.txt";
$fp = fopen($file, 'r') or die("error opening file");
$contents = fread($fp, filesize($file));
fclose($fp);

$splitted = explode("\n", $contents);
foreach ($splitted as $split) {
    if (empty($split)) continue;
    $url = parse_url($split);

    if (!empty($url['query'])) {
        parse_str($url['query'], $vars);
        $x = (int)$vars['x'];
        $y = (int)$vars['y'];
        //imagesetpixel($im, $x, $y, $white);
        // draw a white ellipse at each recorded click position
        // (the color is allocated once above - allocating it inside the
        // loop would exhaust the 256-color palette on busy sites)
        imagefilledellipse($im, $x, $y, 10, 10, $white);
    }
}

imagepng($im);
imagedestroy($im);
?>





Here is a sample generated heatmap screenshot - click to see the whole image.

Enjoy the course! I know that the above-mentioned code could be optimized further, so your suggestions are always welcome!
