Monday, February 05, 2007

Clear acne with honey

Acne is a skin condition that usually begins in both sexes during puberty. One of its causes is the hormonal change that occurs at this age. It often appears as blackheads in the skin pores of the face, neck and back.
Often, when an additional infection follows, acne leads to the formation of pus-filled pimples.
Acne is stressful for teenagers, since this period of rapid growth also brings psychological changes such as irritability and a hot temper.

Careful when treating!
When treated improperly, acne can lead to serious complications, so squeezing pimples, especially those on the face, is strictly off-limits. Young people should know that doing so can affect the body's central nervous system and even lead to meningitis.
The underlying reason is that the facial skin veins drain into brain vessels called sinuses. When a pimple is physically squeezed, pus can travel from the veins into the sinuses and from there into the deeper brain veins.
Incorrect skin treatment also hides another danger - an infection called sepsis, which affects the whole organism. Both meningitis and sepsis are life-threatening, but it is important to know that when treated properly, the vulgaris and juvenilis forms of acne are easy to manage and these complications can be avoided.

What honey products give good results when treating acne?
Masks with pure nectar bee honey
It is recommended to buy honey directly from the producer and not from the store. Also, do not try to liquefy honey that has already crystallized, because this may diminish its healing properties. Just know that on skin contact, at a temperature of about 36°C, a thin layer of honey will liquefy on its own and deliver all of its benefits.

The procedure
Here are three honey procedures that will make your skin clear and beautiful!
  • Wash your skin with water and mild soap. It is better if the soap is honey-enriched. Dry carefully, without irritating the skin, and apply a thin layer of honey. Wait 1 hour and gently wash it off with warm water. Do 2 such procedures daily for 15 days.

  • Pimples can also be treated for a 15-day period with a 30% alcohol (spirit) solution of propolis.

  • Boys: 3 times daily, take 1 teaspoon of honey mixed with honey bloom (1 kg : 200 g), 1 hour before lunch

    Girls: 3 times daily, take 1 teaspoon of honey mixed with honey milk (1 kg : 10 g), 1 hour before lunch

Types of honey
Here is a list of popular honey types and the reasons why they are so beneficial:
  • Lime tree honey offers first-class quality - it has a light yellow color, crystallizes easily and has a specific aroma. It is used for inhalations when fighting respiratory problems. It also has a positive effect on the intestines and kidneys.

  • Acacia honey is considered one of the best honeys - transparent, light, more liquid and with the specific acacia scent. It is recommended for heart and intestinal problems, as well as for inflammations occurring in women.

  • Polyfloral field honey has a light-amber color with a tender aroma and taste. Another member of the family famous for its healing qualities.

  • Monofloral meadow honey - subdivided into transparent and semi-transparent varieties - is used effectively in gynecology and in diets.

  • The sunflower type has a saturated, almost brown color with a unique aroma and slight bitterness.

  • The tea-colored chestnut type comes with a tart taste. This honey has a stimulating effect and is also known as the man's honey because of its aphrodisiac properties.
Did you know?
Inverted sugar: This is the combined amount of glucose and fructose, and it is what determines honey's quality. The best quality honey is a mixed variety with an inverted sugar content of up to 80%.
Saccharose: the plain sugar obtained from sugar beet and sugar cane. Honey with an increased quantity of saccharose is of low value. High levels of saccharose are observed when the bees are fed sugar syrup, as well as when their salivary glands, responsible for turning nectar into sugar, become exhausted.

Friday, January 12, 2007

Duplicate website content and how to fix it

Here I'll present a practical way to avoid the duplicate content penalty.

When is this penalty applied?
This kind of penalty is applied by search engines such as Google when two identical versions of your site's content are detected.

How can your website become a victim of such a penalty?
Modern content management systems (CMS) and community forums offer numerous ways of managing new content, but because of their deep structure their URLs become very long, and search engines are unable to fully spider the site.
The solution for webmasters was to rewrite the old URL, so the index.php?mode=look_article&article_id=12 URL now becomes just article-12.html. As a first step this serves its purpose, but if left like this both URLs are going to be indexed. Through the eyes of a search engine the same content now exists in two instances, and of course the duplicate filter is triggered:
First instance: index.php?mode=look_article&article_id=12

Second instance: article-12.html
Easy solution
The solution uses the PHP language and the Apache .htaccess file.
First off we'll rewrite our URLs so they become search-friendly. Let's assume we have to map index.php?mode=look_article&article_id=... to article-....html

Create an empty .htaccess file and place the following in it. First edit the code and fill in your website address. If you don't have a subdomain, also remove the subdomain part.
RewriteEngine on

RewriteRule article-([0-9]+)\.html    http://www.yourwebsite/subdomain/index.php?mode=look_article&article_id=$1&rw=on

RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /subdomain/index\.php\ HTTP/
RewriteRule index\.php http://www.yourwebsite/subdomain/ [R=301,L]

RewriteCond %{HTTP_HOST} .
RewriteCond %{HTTP_HOST} ^www\.yourwebsite\.subdomain [nc]
RewriteRule ^(.*)$ http://yourwebsite/subdomain/$1 [R=301,L]

Explanation:
  • RewriteRule article-([0-9]+)\.html http://www.yourwebsite/subdomain/index.php?mode=look_article&article_id=$1&rw=on
    Those lines allow article-12.html to be loaded internally as index.php?mode=look_article&article_id=12
    The &rw=on variable is important for the PHP code later on, so don't forget to include it.
  • RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /subdomain/index\.php\ HTTP/
    RewriteRule index\.php http://www.yourwebsite/subdomain/ [R=301,L]
    These lines prevent index.php from being treated as a separate page (which would dilute your website's PR) and transfer all the PR from index.php to your domain.
  • RewriteCond %{HTTP_HOST} .
    RewriteCond %{HTTP_HOST} ^www\.yourwebsite\.subdomain [nc]
    RewriteRule ^(.*)$ http://yourwebsite/subdomain/$1 [R=301,L]
    This avoids duplicate URLs such as the www and non-www versions and transfers all requests and PR to the non-www site.

Then create a file header.php and include it in your website before all other files:

Put there:

<?php
// header.php - prints the robots meta tag based on the rw flag added by mod_rewrite
$rw = isset($_GET['rw']) ? $_GET['rw'] : '';
if ($rw == "on") {
    echo "<meta content=\"index,follow\" name=\"robots\" />";
} else {
    echo "<meta content=\"noindex,nofollow\" name=\"robots\" />";
}

This tells the search engines to index only the pages that have the rw flag set to on - that is, the rewritten article-12.html style pages set up earlier.
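
The post doesn't show how header.php gets pulled in, so here is a minimal sketch of an article template that includes it; the file names and layout are only illustrative, not taken from the original CMS:

<!DOCTYPE html>
<html>
<head>
<title>Article</title>
<?php include 'header.php'; // prints index,follow or noindex,nofollow depending on &rw=on ?>
</head>
<body>
<?php
// ...the usual index.php?mode=look_article&article_id=... rendering goes here
?>
</body>
</html>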

Of course, if you have access to your robots.txt file in the root of your domain, you can simply disallow the look_article script there and you are done:
User-agent: *

Disallow: /look_article.php



Notes: For those using a CMS - check whether your pages are still accessible using different parameters in the URL.
Example: you've deleted the article with id=17, but the empty template is still accessible and returns a 200 OK header status code - this will surely be recognized as thin content by Google.
Solution:
1. Find those empty pages and give them a 404 Not Found header status code:

header("Status: 404 Not Found");


2. Create an error404.html file explaining to the user that they are trying to access a non-existent page.

3. Then add in your .htaccess file the custom 404 error page:
ErrorDocument 404 /your_domain_name/error404.html

This way the search engine spider won't penalize your template for displaying empty information - it will now see those pages as 404 Not Found documents.

The next step involves cleaning up already indexed but duplicated website content in order to regain the search engine's trust.

Above is a sample screenshot taken from the Google Search Console Keywords report. You may ask why we need it when we can use Firefox's integrated Show keyword density function.
Well, one benefit is that this report shows the significance of specific keywords across your pages. Let me explain what this means:
Suppose you are optimizing content for the keyword 'cars'. It's normal practice to repeat 'cars' 2-3 times, style it in bold, etc. Everything is fine as long as you do it naturally. The moment you overstuff your page with this keyword, it will get penalized and lose its current ranking in the Google SERPs, so you have to be careful with such repetitions.
Moreover, in the report you can see the overall keyword significance for the whole website. And because Google likes thematic websites, it is really important for these keywords to reflect your website's purpose or theme. Otherwise you're just targeting the wrong visitors and shouldn't be puzzled by a high abandonment rate.

But enough with the theory - now let's discuss how you can fix things up:

Check every individual webpage's keyword density online via Webconfs and correct (reduce) words that are used over 2%. Again, this percentage depends on the keyword competition in your niche, so tolerable levels can vary up and down.
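
If you prefer to do the same check locally instead of through Webconfs: keyword density is just occurrences divided by total words, times 100. A rough PHP sketch (the file name and keyword are only examples):

<?php
// Rough keyword density check - sketch only.
function keyword_density($html, $keyword)
{
    $words = str_word_count(strtolower(strip_tags($html)), 1);  // all words on the page
    if (count($words) == 0) return 0;
    $hits = 0;
    foreach ($words as $word) {
        if ($word == strtolower($keyword)) $hits++;
    }
    return 100 * $hits / count($words);                          // percentage
}

$density = keyword_density(file_get_contents('article-12.html'), 'cars');
if ($density > 2) {
    echo "Keyword 'cars' is used too often: " . round($density, 2) . "%";
}
?>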

Add the 'canonical' tag to all your website pages:
<link rel="canonical" href="http://www.example.com/your_preferred_webpage_url.html" />
(and make sure to specify the URL that you really prefer!). This tells the search engine which webpage is the legitimate one. More info: http://googlewebmastercentral.blogspot.com/2009/02/specify-your-canonical.html



Blogger users can add the canonical tag with the following code in the head section of the Template:
<b:if cond='data:blog.pageType == "item"'>
<link expr:href='data:blog.url' rel='canonical'/>
</b:if>
(it will remove the parameters appended at the end of the URL such as http://nevyan.blogspot.com/2016/12/test.html?showComment=1242753180000
and specify the original authority page: http://nevyan.blogspot.com/2016/12/test.html )

Next, to prevent duplicate references to your archive (i.e. .../2009_01_01_archive.html) and label pages (i.e. /search/label/...) from getting indexed, just add:
<b:if cond='data:blog.pageType == "archive"'>
<meta content='noindex,follow' name='robots'/>
</b:if>
<b:if cond='data:blog.pageType == "index"'>
<b:if cond='data:blog.url != data:blog.homepageUrl'>
<meta content='noindex,follow' name='robots'/>
</b:if>
</b:if>

To prevent indexing of the mobile duplicates of the original pages:
    <b:if cond="data:blog.isMobile">
<meta content='noindex,nofollow' name='robots'/>
</b:if>

And a working solution blocking even the /search/ tag pages from indexing, allowing only the homepage and posts to be indexed:
    <b:if cond='data:blog.pageType == "index" and data:blog.url != data:blog.homepageUrl'>
<meta content='noindex,follow' name='robots'/>
</b:if>

Thursday, December 28, 2006

How to recover deleted files on formatted drive?

Here I'll show you how to use the File Scavenger program, which allows you to recover permanently lost files from your hard drive. The application is small but very effective at restoring deleted information. It scans your hard drive and detects files that can still be restored.

Recently I had to restore a 60Gb HDD that had been formatted twice (an NTFS quick format and a full format during Windows installation), and right after those formats two copies of Windows XP had been installed on it.
I've tried lots of recovery software on this drive to see what results they could produce, and I think this program deserves special attention.
In the end I managed to rescue 35Gb of music and photos from the hard drive.




Usage:

First off, make sure you have a spare hard disk with a size similar to the damaged/deleted/formatted one.
Example: if you want to restore 40Gb of information, prepare another drive with 40Gb of free space.
Get a license, because otherwise you can recover only the starting chunks of the deleted files.

Start File Scavenger. In Search mode select Long Search and press the Search button. On the warning message press: No. Display deleted files. Then you'll have to wait some time until the program finishes scanning.
In the Recover input box, use the Browse... button to select the free hard drive already prepared for storing the information. You'll have to wait for this process to finish. When it's ready you'll have your information back, and all you'll have to do is sort it, because the files will have been renamed.

Happy and successful restoring, and be sure to check out my Windows 10 course. Cheers!

Thursday, December 21, 2006

Free website click heatmap - DIY

What are click heat maps?
Heat maps show the places on your website where users mostly click or hover their mouse. They are also known as website hot spots.
If you want to learn more about the JavaScript techniques you can visit the JavaScript course

 
Why do you need to know your website hotspots?

Once you know their location, you can reorder your important information, optimize your adverts to increase their CTR (click-through rate), etc.

Remember that this type of click tracking is different from simple web counters. Click heatmaps record the exact x,y positions of clicks and mouse hovers, which makes such statistics really valuable for anyone looking for usability improvements.

Of course, there are free services such as Crazyegg.com and Clickdensity.com, but the JavaScript they offer actually slows down the user experience. So the following is a guide on how to build your own heatmap tracking. I've combined 2 techniques so the AJAX script can work across multiple domains. Here are the URLs of their respective sources:


for the heatmap:
http://blog.corunet.com/english/the-definitive-heatmap

for the cross-domain AJAX:
http://blog.cbciweb.com/articles/tag/asynchronousjavascriptacrossdomains

INSTRUCTIONS:
1. Create an empty, readable and writable file clickcount.txt on your webserver.
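
If you can't set the file permissions through your FTP client, a one-off PHP script like this can do it (a sketch only - run it once and then delete it):

<?php
// One-off setup: create the click log and make it writable by the web server.
$log = dirname(__FILE__) . '/clickcount.txt';
if (!file_exists($log)) {
    touch($log);
}
chmod($log, 0666);   // readable and writable; tighten this on shared hosting
echo is_writable($log) ? "clickcount.txt is ready" : "clickcount.txt is still not writable";
?>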
2. Place the following JavaScript code in your website, just above the closing </body> tag:

var xOffset,yOffset;
var tempX = 0;
var tempY = 0;
 
 
//detect browser
var IE = document.all?true:false

//find the position of the first item on screen and store offsets
//find the first item on screen (after body)
var firstElement=document.getElementsByTagName('body')[0].childNodes[1];
//find the offset coordinates
xOffset=findPosX(firstElement);
yOffset=findPosY(firstElement);
if (IE){ // In IE there's a default margin in the page body. If margin's not defined, use defaults
var marginLeftExplorer  = parseInt(document.getElementsByTagName('body')[0].style.marginLeft);
var marginTopExplorer   = parseInt(document.getElementsByTagName('body')[0].style.marginTop);
/*assume default 10px/15px margin in explorer*/
if (isNaN(marginLeftExplorer)) {marginLeftExplorer=10;}
if (isNaN(marginTopExplorer)) {marginTopExplorer=15;}
xOffset=xOffset+marginLeftExplorer;
yOffset=yOffset+marginTopExplorer;
}
/*attach a handler to the onmousedown event that calls a function to store the values*/
document.onmousedown = getMouseXY; 

/*Functions*/
/*Find positions*/
function findPosX(obj){
var curleft = 0;
if (obj.offsetParent){
while (obj.offsetParent){
curleft += obj.offsetLeft
obj = obj.offsetParent;
}
}else if (obj.x){
curleft += obj.x;
}
return curleft;
}
 
 
function findPosY(obj){
var curtop = 0;
if (obj.offsetParent){
while (obj.offsetParent){
curtop += obj.offsetTop
obj = obj.offsetParent;
}
}else if (obj.y){
curtop += obj.y;
}
return curtop;
}
function getMouseXY(e) {
if (IE) {
tempX = event.clientX + document.body.scrollLeft
tempY = event.clientY + document.body.scrollTop
} else {
tempX = e.pageX
tempY = e.pageY
}
tempX-=xOffset;
tempY-=yOffset;

var url='http://yourwebsite.com/scripts/empty.php' /* Type your website URL here*/
url = url + '?x='+tempX+'&y='+tempY;
ajad_send(url);
 
return true;
}
 
var ajad_ndx_script = 0;
 
function ajad_do (u) {
// Create new JS element
var js = document.createElement('SCRIPT');
js.type = 'text/javascript';
ajad_ndx_script++;
js.id = 'ajad-' + ajad_ndx_script;
js.src = u;
 
 
// Append JS element (therefore executing the 'AJAX' call)
document.body.appendChild(js);
return true;
}
 
function ajad_get (r) {
// Create URL
// Do AJAD
return ajad_do(r);
}
  
function ajad_send(url) {
// referrer
// send it
ajad_get(url);
 
// remove the last script node.
document.body.removeChild(document.getElementById('ajad-' + ajad_ndx_script));
ajad_ndx_script--;
}



3. Create empty.php file and fill it with:
<?php
$q=$_SERVER['REQUEST_URI'];
include_once("functions.php");
save_file($q, "clickcount.txt");
?>
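
The empty.php above relies on a save_file() helper from functions.php, which the post doesn't show. A minimal guess at what it does - append one line per click to the log - would look like this:

<?php
// functions.php - sketch of the helper used by empty.php (assumed behaviour).
function save_file($line, $filename)
{
    $fp = fopen($filename, 'a');
    if ($fp === false) return false;
    flock($fp, LOCK_EX);            // avoid interleaved writes from simultaneous clicks
    fwrite($fp, $line . "\n");
    flock($fp, LOCK_UN);
    fclose($fp);
    return true;
}
?>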


4. Statistics
Allow at least 2 days for the clicks to accumulate. If you open clickcount.txt in your browser you'll see the x,y coordinates of the user clicks.
To visualize the gathered data, create a file click_count.php with the following contents:

<?php
Header("Content-Type: image/png" );
$width=1024;
$height=4300;

$im=ImageCreate($width,$height);
$red = ImageColorAllocate ( $im, 255, 0, 0 );
$white = ImageColorAllocate ( $im , 255, 255, 255 );
$black = ImageColorAllocate ( $im ,0, 0, 0 );
$blue = ImageColorAllocate ( $im , 0 , 0 , 255 );
$gray = ImageColorAllocate ( $im , 0xC0, 0xC0 , 0xC0 );
ImageFill ( $im , 0 , 0 , $black );
$file="clickcount.txt";
$fp = fopen ( $file, 'r' ) or die("error opening file");
$file=fread($fp,filesize($file));
$splitted  = explode ("\n", $file);
foreach ($splitted as $split) {
if (empty($split) || !is_string($split)) continue;
$url=parse_url($split);

if (!empty($url['query'])) {
parse_str($url['query'],$vars);
$x = $vars['x'];
$y = $vars['y'];
//imagesetpixel($im,$x,$y,$white);
// draw  white ellipse
$col_ellipse = imagecolorallocate($im, 255, 255, 255);
imagefilledellipse($im, $x, $y, 10, 10, $col_ellipse);
}

}

Imagepng ($im);
imagedestroy($im);
?>





Here is a sample generated heatmap screenshot: Click to see the whole image.

Enjoy the course! I know that the above code could be optimized further, so your suggestions are always welcome!

Sunday, December 03, 2006

Skin care - how to fight acne II

If you are deeply interested in fighting acne, here is some more advanced information that I've gathered.

The hormones
One of the main factors causing acne is excess hormones in our body. When our hormones are out of balance, they signal the respective glands to produce additional amounts of oil in order to restore the balance.

During the teenage years androgen hormone levels spike. This causes excess sebum production, which at the end of its pathway clogs our hair follicles and thus contributes to the appearance of acne.

Why getting sufficient essential fatty acids (EFAs) is important
Recent studies show that consuming essential oils in the right proportions normalizes the levels of EFAs in our sebaceous glands. The same EFAs control the production of androgen hormones, thus helping to restore a normal hormonal balance in the body.

Some facts about EFA
When our body has a deficit of essential fatty acids, we will have:
- a weakened immune system
- more inflammation
- poor skin
- skin eruptions that won’t heal easily
- increased sebum production
- enlarged sebaceous glands

Benefits: these oils come straight from fish and vegetable oils, nuts, and seeds, as well as from additional supplements.

Balancing EFA’s
Before trying to manually balance your hormone levels, I would recommend having hormonal tests done. Based on the results you'll see what needs to be adjusted.
The three fatty acids you need to balance are Omega-3, Omega-6, and Omega-9. Most people have an excess of Omega-6, so they need to concentrate on getting more Omega-3 EFAs into their diet. On the flip side, if your Omega-3 intake is higher than your Omega-6, concentrate on supplementing with Omega-6 oils.

Use of flax seed oil
It's less potent than fish oil, so if you want quicker results you could try fish oil and spirulina first and then gradually settle on flax oil. The problem with flaxseed oil is that the flaxseed must be ground before digestion, and the body also needs an enzyme called delta-6 desaturase (D6D) to transform the flax seed's linoleic acid into GLA.


Spirulina is the most potent source of GLA because it is derived from marine or freshwater algae. Fish that consume such algae become more potent as well. Spirulina taken together with fish oil eliminates the body's need for D6D in order to work well. It's very high in protein and can also enhance your muscular strength and athletic endurance. Studies conclude that spirulina is effective in increasing isometric muscle strength, especially in trained athletes, as well as enhancing isometric muscle endurance in trained and untrained participants alike.
A mix of protein, amino acids, and vitamin B12, spirulina is an excellent dietary supplement for vegetarians, since 60 to 70% of it is protein. It can also give you an immune-enhancing effect.

Nature of the D6D enzyme
This enzyme depends on many factors - minerals such as magnesium, zinc and calcium, and vitamins B6, B12, E and biotin - and it is important to remember that D6D potency declines as we age. The delta-6 desaturase enzyme is also inhibited by stress, disease, increased insulin levels, trans-fatty acids (like those found in margarine), saturated fats and alcohol. When D6D is weak, the digestion of flax seed gets compromised. The solution is a direct supply of gamma-linolenic acid (GLA) from foods such as spirulina, borage, evening primrose, and others.
Fixing the inhibition problem
We can go further and inhibit the D5D enzyme, which converts GLA to the inflammatory arachidonic acid, turning the hard-won gamma-linolenic acid back into arachidonic acid. The solution is to eat sesame seeds: they inhibit the D5D enzyme, so more gamma-linolenic acid stays available for the body.

GLA is then broken down either to arachidonic acid (AA) or to another substance called dihomo-gamma-linolenic acid (DGLA). DGLA is the actual precursor in the production of prostaglandins - it is beneficial and reduces inflammation in our bodies, which is our goal. By consuming enough magnesium, zinc and vitamins C, B3 and B6, GLA will convert to DGLA and not to AA.

The anti-inflammatory Prostaglandins
Prostaglandins are hormones made from fatty acids (with the help of enzymes) that act as chemical messengers. They act as regulatory molecules and are responsible for the cell's inflammatory and anti-inflammatory responses. Derived from Omega-3 and Omega-6 oils they help to regulate every function in our cells and organs. They also keep our androgen hormones in control so that excess sebum won't be produced resulting in acne. Normally, enzymes in your body break Omega-3 acids down to Eicosapentaenoic Acid(EPA) and Docosahexaenoic Acid(DHA). These two fatty acids then change into prostaglandins.

When Omega-3 fatty acids are not enough

Omega-3s consist of the following three fatty acids: alpha-linolenic acid (ALA), eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA).
ALA, created in the chloroplasts of green plants from linoleic acid (LA), is important because it serves as a building material for EPA. DHA, on the other hand, is an integral part of eye and brain tissue. It is found in marine algae (spirulina), plankton, fish, and mammals.
However, eating plenty of the essential Omega-3 and Omega-6 fatty acids (EFAs) is not enough to produce EPA and DHA. It is critical for acne, and more importantly for your health, that you get enough EPA and DHA into your cells and organs so that they can produce the required prostaglandins. That is why it is necessary to take a fish oil supplement that contains both the EPA and DHA fatty acids.
In order to be fully digested they must be taken after eating; otherwise they'll quickly be burned as energy fuel and you won't fully benefit from them.

Notes on salicylic acid
The anti-inflammatory mechanism of salicylic acid is due to decreased prostaglandin synthesis and possibly to inhibition of the synthesis and/or activity of other mediators of the inflammation response. The inflammation that produces the redness and swelling is what we actually see as acne.

Omega sources reference list
Omega-3: Avocados, Sesame seeds, Pumpkin seeds, Walnuts, Dark green leafy vegetables (spinach, mustard greens, kale), Wheat germ oil, Salmon, Sardines, Albacore tuna

Omega-6: Flaxseed oil, Flax seeds, Grapeseed oil, Pistachio nuts, Olives, Olive oil, Sunflower seeds, Evening primrose oil, Pumpkin seeds, Black currant oil

Omega-9: Olive oil, Avocados, Cashews, Almonds, Olives, Sesame oil, Pecans, Pistachio nuts

Friday, November 24, 2006

Free dos antivirus scanners

Here are free antivirus scanners and cleaners that you can run without needing to install them. They are also suitable for your rescue USB flash drive.

F-Prot — antivirus scanner and disinfectant (no longer available for DOS). Download F-Prot Antivirus for DOS: http://files.f-prot.com/files/dos/f-prot.zip (backup link)
Extract the archive into a directory, then download and extract the antivirus signatures, overwriting or updating the existing built-in F-Prot signature files.
Press Enter on the SCAN menu, then with the arrow keys change Action: to Automatic disinfection and Files to Attempt to identify files. Then press START.



RHBVS (ROSE SWE's Heuristic Based Virus Scanner)
http://www.cfg2html.com/rose_swe/



Kaspersky Antivirus - rescue disk
https://support.kaspersky.com/viruses/rescuedisk#downloads


DR.Web rescue disk:
http://www.freedrweb.com/livedisk/

Monday, January 02, 2006

How to get out of Google Sandbox

Ever wondered why a particular website might get fewer and fewer visits?
One reason might be that it is inside Google's sandbox, so it gets no traffic from Google queries. In such situations the following can be observed:
1. A drop in the total number of website visitors coming from google.com.
2. A sudden drop in Google PageRank across all website pages.
3. When querying Google for specific keywords, the website appears in the last 2-3 pages of the search results or is missing from Google's listings entirely.

How to check if the website is within Sandbox?

If you wish to check whether a sandbox has been applied to a particular website then try the following methods:

I Method
Use this web address to check the indexing of your website pages against a specific keyword:
http://www.searchenginegenie.com/sandbox-checker.htm
II Method
This one is much more reliable. Just
open your web browser, go to http://www.google.com
and type in the search box:
www.yourwebsite.com -asdf -asdf -asdf -fdsa -sadf -fdas -asdf -fdas -fasd -asdf -asdf -asdf -fdsa -asdf -asdf -asdf -asdf -asdf -asdf -asdf
If your website appears in these search results and ranks well for its keywords, then your website is in Google's sandbox - the nonsense exclusions bypass the filter, so a site that ranks here but not in normal queries is being filtered.

III Method

Run your web browser, go to http://www.google.com and type:
site:www.yourwebsite.com
If no results are found, then your website is out of Google's indexing database. The difference between non-indexed fresh websites and sandboxed ones is that for sandboxed ones you will not see: If the URL is valid, try visiting that web page by clicking on the following link: www.yourwebsite.com

IV Method
When running a Google query, add this at the end of the URL:
&filter=0

This will show all the results from the primary and supplemental google index of your website. If your website has been penalized then its results will reside in the supplemental index.

How to get your website out of Google's Sandbox

Next follows a guide on how to get a website out of Google's sandbox by applying the following techniques:

* Have a website structure no deeper than 3 levels (i.e. don't put content more than 3 links away from the homepage, because the crawler/spider might stop crawling it).

* Rewrite the meta tags to explicitly state which pages should not be indexed. For this, put in the header section of a page:
<meta name="robots" content="index, follow"> - for webpages that should be indexed
<meta name="robots" content="noindex, nofollow"> - for webpages that you don't want indexed
* Delay the crawling machines
This is especially important if your hosting server doesn't provide fast bandwidth.
In your robots.txt file put:
User-agent: *
Crawl-Delay: 20

You can also adjust the Crawl delay time.
* Remove the duplicate or invalid pages of your website that are still in Google's index/cache. First prepare a list of all the invalid pages, then use Google's page for urgent URL removal requests:
https://www.google.com/webmasters/tools/url-removal
Ensure that those pages are no longer indexed by typing your full website address in Google with site:your_website.com. If there are no results, you've succeeded in getting the pages out of Google's index. It may sound strange, but this way you can have them re-indexed again cleanly. When ready, remove all the restrictions you may have added in .htaccess and in the webpage headers (noindex, nofollow).
Next, go to http://www.google.com/addurl/?continue=/addurl , submit your website in the field for inclusion and wait for the re-indexing process to start.
While waiting, you can start getting links from forums and article directories to your quality content; they should point not only to your top-level domain but also to specific webpages.
For example: not only <a href="www.website.com"> but also <a href="www.website.com/mywebpage1.html">


* Remove JavaScript and meta-refresh redirects
Check whether you are using meta-refresh redirects in your website. For example:
<meta http-equiv="refresh" content="5; url=http://www.website.com/filename.php">
If so, remove them, because they are treated as spam by Google's bot.
How: You can check your whole website using software such as Xenu Link Sleuth:
http://home.snafu.de/tilman/xenulink.html
Download and start the program. The whole process is straightforward - just type your website address into the input box
and start the check (click File->Check URL; that brings up a form for you to fill in with your website's URL).
This tool will check every page on your website and produce a report. If you see 302 redirects in the report - beware, and try to fix them too. Using Xenu you can also check your website for broken links, etc.

* Disavow 302 redirects from other sites
Check whether the websites linking to you return HTTP response code 200 OK.
In the Google search box type allinurl: http://www.yoursite.com . Then check every website other than yours by typing it here:
http://www.webrankinfo.com/english/tools/server-header.php
and look for HTTP response code 200 OK (a PHP alternative is sketched at the end of this item).

If any of them return a 302 header response code, try to contact the administrator of the problematic website to fix the problem. If you think they are stealing your PageRank - report them to Google's spam report page:
http://www.google.com/contact/spamreport.html
with a checkmark on Deceptive Redirects. As a last resort, you can also place the URL in Google's disavow tool to clean up your backlink profile: https://www.google.com/webmasters/tools/disavow-links-main
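
If you'd rather not depend on the external header-checking tool, PHP's get_headers() can report the status line of any page linking to you; a small sketch (the URL is just an example):

<?php
// Print the HTTP status line of a page that links to you - sketch only.
$url = 'http://www.example.com/page-linking-to-you.html';   // example URL, replace with a real one

$headers = get_headers($url);
if ($headers === false) {
    die("Could not reach $url");
}
echo $headers[0] . "\n";    // e.g. "HTTP/1.1 200 OK" or "HTTP/1.1 302 Found"

if (strpos($headers[0], ' 302 ') !== false) {
    echo "This link goes through a 302 redirect - worth contacting the webmaster.";
}
?>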

For the next steps you will need access to your web server's .htaccess file and the mod_rewrite module enabled in your Apache configuration:

* Make static pages out of dynamic ones
Using mod_rewrite you can rewrite your dynamic page URLs to look like static ones. So if you have a dynamic .php page with parameters, you can rewrite the URL to look like a normal .html page:

look_item.php?item_id=14

for the web visitor will become:

item-14.html
HOW: You have to add the following lines to your .htaccess file (placed in the root directory of your web server):
RewriteEngine on
RewriteRule item-([0-9]+)\.html http://your_website.com/sub_directory/look_item.php?item_id=$1 [L]
(note that there is no R flag here - this is an internal rewrite, so the visitor keeps seeing the static item-14.html URL)
To transfer the previously accumulated PR and backlinks,
type in Google's search box:

site:http://your_website.com
This query will show all of the website's indexed pages. Since you've been moving to search-engine-preferred (static .html) URLs, it would be good to transfer the accumulated PR and links of the old dynamic .php URLs to the corresponding static .html URLs. Here is an example of transferring a .php URL request to a static page URL of the website.
HOW: Add the following line in your .htaccess file:

RewriteRule look_item\.php http://website.com/item.html [L,R=301]
where 301 means Moved Permanently, so the Search Engine Bot will map and use http://website.com/item.html instead of look_item.php as a legitimate source of information.

* Robots.txt check
To avoid Google spidering both the .html and .php pages and treating them as duplicate content (which is bad), place a canonical tag on the web-page version that you prefer and empty your robots.txt file, so that Google will consolidate both the PHP and HTML pages into one preferred .html version:


Important Note: If you already have .php files in the Google index and don't want to use the canonical header tag, you can instead place the noindex, nofollow meta attributes in the .php versions of the files, which requires a little bit more effort.
* Redirect www to non-www URLs
Just check the indexing of your website with and without the preceding "www". If you find both versions indexed, you are surely losing PageRank and backlinks and presenting duplicate content to Google. This happens because some sites link to you with the http://www prefix, and some prefer to use the bare domain version http://. It's hard to control whether the sites linking to your website use "www" or "non-www". This is where the Apache Redirect and Rewrite rules come in, transferring the www URLs of your website to the non-www URLs. Again, to avoid PR loss and duplicates, you will want your website URL to be accessible from only one location.
HOW: Place at the end of your .htaccess file the following lines:
RewriteCond %{HTTP_HOST} ^www\.your_website\.com [nc]
RewriteRule (.*) http://your_website.com/$1 [R=301,L]
When finished with the redirect from www to non-www and https to http versions of your website or vice-versa, specify your preferred version in google webmaster tools. 

* Redirect index.php to root website URL
There is one more step for achieving non-duplicated content. You must point your index.html, index.htm, index.asp or index.php to the ./ or root of your website.
HOW: Insert in your .htaccess file the following lines, before the two lines mentioned previously:
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /subdomain/index\.php\ HTTP/
RewriteRule index\.php http://yourwebsite.com/subdomain/ [R=301,L]
Note: If your website is hosted under a subdomain fill its name in the /subdomain part. If not just delete the /subdomain . You can replace index.php with index.html, index.asp or whatever suits you.

* Have a custom error 404 page
In your .htaccess file type:
ErrorDocument 404 /your_website_dir/error404.html
Then create a custom webpage named error404.html instructing the user what to do when they come across a non-existent page. Finally, check that the 404 page actually returns a 404 Not Found header status code, and not 200 OK.

Congratulations - by following these steps your website will be re-indexed soon. In a few days it will be re-crawled and out of the sandbox, and hopefully the advice above will help you achieve better indexing for your website.

Cheers!

Subscribe To My Channel for updates