Pilot Flying J, a truck-stop chain based in Knoxville, TN, has hit the headlines repeatedly this year over federal fraud charges and a class action lawsuit. A recent 120-page affidavit filed in federal court claims that members of the company's sales force scammed trucking companies by reducing or denying rebates they were owed in a diesel fuel rebate scheme. Pilot Flying J is still under investigation, and there is as yet no way to determine just how much money these companies were scammed out of. A Pilot Flying J class action lawsuit was pencilled in for November, but the case is changing rapidly, with new information being released almost daily. Many truckers are opting out of the class action and seeking advice on independent legal action, as the class action lawsuit is perhaps not their best chance of monetary damages. Back in April, Robert H. Root, a special agent investigating the case, described the scheme as a:
“conspiracy and scheme to defraud executed by various Pilot employees to deceptively withhold diesel fuel price rebates and discounts from Pilot customers … for the dual purposes of increasing the profitability of Pilot and increasing the diesel sales commissions of the Pilot employees participating in the fraud.”
However, Pilot Flying J CEO Jimmy Haslam initially denied any wrongdoing at an earlier news conference, stating that:
“the foundation of this company is built on its integrity and that any willful wrongdoing by any employee of this company at any time is intolerable.”
Haslam also said the company would cooperate fully with the authorities and conduct its own investigation as well. Now it looks as though he will be identified as a major player in a large-scale federal fraud, and other senior managers are being implicated, with seven guilty pleas entered to date.
An employee, identified only as a confidential source, told investigators that the rebate scheme was discussed during "back door" meetings attended by Haslam and Pilot President Mark Hazelwood. Nonetheless, Haslam has denied any wrongdoing on the company's part and has blamed individual employees for the scam.
The informant had much to say about the practice, which was known by a range of euphemisms, from "manual rebates" to "screwing." Several other informants in the investigation have secretly recorded conversations among Pilot Flying J employees holding frank – and often profane – discussions about the rebate scheme. Recent revelations, including a videotaped meeting of senior Pilot Flying J executives, have confirmed the informants' claims that the fraud was engineered and intentional.
The case continues and depositions are rolling in. Pilot Flying J is right in the thick of it now, and thousands of angry truckers want their money back! If you are fighting to get your rebate money back, you will need legal assistance. Were you ripped off? If so, you can opt out of the class action lawsuit and claim greater monetary damages. http://pilotflyingjclassactionlawsuit.com is a site dedicated to helping those ripped off by Pilot Flying J get their money back!
Facebook Graph Search
Facebook Graph Search is a new system currently in beta testing and rolling out to everyone in phases. Despite the confusing name, Graph Search is a way to get personalized search results, much like those you get from Google or Bing. The main difference is that the results are far more contextualized in terms of your relationships and social circles, and the way you interact with them.
The way it actually works is complex – even Facebook's own help and blog pages covering Graph Search are not for the faint-hearted reader. Imagine a family tree built around your Facebook friends and interactions; now attach everybody else's family trees, with all their friends and their own interactions (some of which will be common to yours, as you interact with them). The result is an extremely complex web of connections, which Graph Search uses to deliver the most appropriate results for your search terms. All previous Facebook search mechanisms were based on keyword indexing; Graph Search takes a more relational approach.
Although this may seem a little too in-depth for a Facebook marketing guide, it is important to know how Facebook Graph Search works so that your pages can be found by the maximum number of search users. Behind the scenes, Facebook has created an enormous index of entities and their relationships, and overlaid this with live information about locations, shares, likes and comments as they appear on people's timelines. A main component is attribute mapping. This would, for example, connect all the people who are friends and live in a specific place – a very straightforward mapping. Graph Search performs complex mapping of many attributes and indexes in this way.
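As a sketch of the idea (the names and attributes below are invented, and this is only a toy model of what Facebook does at massive scale), an attribute mapping can be thought of as a set intersection:

```python
# Toy model of Graph Search "attribute mapping": each attribute is an index
# (a set of people), and a personalized query intersects those indexes.
# All names and data below are invented for illustration.

friends_of_me = {"ana", "ben", "carla", "dev"}
lives_in_knoxville = {"ben", "carla", "eli"}
likes_hiking = {"carla", "dev", "eli"}

# "My friends who live in Knoxville" - an intersection of two indexes.
friends_in_knoxville = friends_of_me & lives_in_knoxville

# "My friends in Knoxville who like hiking" - stack a third attribute.
hiking_friends_in_knoxville = friends_in_knoxville & likes_hiking

print(sorted(friends_in_knoxville))         # ['ben', 'carla']
print(sorted(hiking_friends_in_knoxville))  # ['carla']
```

Each extra attribute in a query simply narrows the intersection further, which is why the results feel so personal.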
The effort to get to page one involves a myriad of optimizations to content, media, website page structure, linking – the list is very long indeed. Standard optimization using regular and well-known methods includes adding keywords and tags and using social media to popularize the site. In days gone by, a whole list of tactics used by "optimizers" brought instant results, some very impressive – "page one for any keyword in a week" was a common claim, and such claims can still be found in the recesses of the Internet. These optimizations either did not last long or, more frequently, caused the site to be hit hard when Google upped the ante by filtering out the spammers and SEO gamers in its more recent updates. Google is very specific about what it does and does not like.
We discuss Google mainly because its search engine services over 80% of the world's searches. If you are not in Google, you are nowhere. Even more pertinent, you have to be on page one of the search results to get any traffic of note. This is where optimization begins, where website owners start to recover their investments and online business thrives. Rank lower than page one and you may as well be trading from a beach hut on Easter Island with no Internet.
Come Back for More
Take note that Google wants you to keep coming back for more, so they are constantly adding new avenues of opportunity and innovative services. This is part of the reason they stay at number one in the search engine world – they strive to offer value and keep improving on that promise. In the same way, they want you to do the same with your website by offering value to the traffic they bring. Be under no misconception – you don't have a right to Google traffic; Google is your primary customer, coming before the people who use their engine. You are offering a product to Google – your website. In return, they serve your product on to their users.
What does Google want from you?
To begin with, you need to know what Google does like. Google now takes into consideration social media shares, social signals, bookmarking and other social networking interactions that create a measurable trail of activity. Backlinks, and the traffic that traverses them, are also measurable components that Google uses to choose the top results. The recent Penguin update paid specific attention to sites with an unnatural number or type of backlinks: those with 75,000 dubious backlinks from forums and comments were considered to be "gaming" the system. It transpires that the only valuable backlinks now are from sites with relevant content and some authority (such as a PageRank of 3 or above).

The habit of stuffing large numbers of keywords into text has also caught Google's attention, as have content spinning and fake reviewing. This leaves little in the armory of the less desirable SEO companies; nevertheless, you will be sold these techniques as valid SEO practices and exploited until you realize your error. In reality, they will get your site banned or sandboxed by the big G, who can simply make you disappear, never to be seen again. Some site owners hit by the updates simply started from scratch rather than try to repair their SEO damage. Welcome to the Easter Island Trading Co.
This is a very important point when looking for SEO companies. The regular changes to Google's algorithms mean that to get ranked, you are basically going to have to do what they ask:
- Provide quality content
- Only make relevant links
- Update your site frequently
- Become popular in social media as well as simply performing in search
- Optimize your site and content according to the published quality guidelines [link here]
Google is a business. Google wants to serve the best results, with little to no error. Google wants to solve your problem and present you with the information that will best serve your needs.
The only way to make a long-term impact is to cater to the needs of your customer. This means that successful optimization must conform to the framework Google has laid out in its quality guidelines. Also be clear about the difference between two distinct processes – driving traffic and optimizing for search. Many SEO companies will blur the line between these activities.
Google wants Quality
Quality is not simply the gloss of a well-honed website landing page – it is quality for the user. Sites should be optimized on-site and off-site in organic ways, rather than relying on contrived methods such as paid links or article syndication that simply copies the same article to thousands of low-value locations. Quality involves updating content with useful media and links, and maintaining a steady flow of backlinks from respectable sources.
A single blog post that provides unique and engaging content with a single link is far more powerful in SEO terms than 1,000 backlinks from forums and spam comments – which are now actually damaging to a site. Social media campaigns offer a huge opportunity, and SEO providers that can offer these services are relative newcomers. The old tactics of mass linking and paid SEO efforts that are patently of little user value are still touted and sold heavily. Next time you look for an SEO provider, look at what they profess they can do. Google has got a lot smarter of late – those making money online through SEO, whether providers or consumers, had better pay attention. Content and quality are the new rulers of the search engines and, surprisingly, as the algorithms become more complex, SEO has seemingly become simpler. The reality is that SEO remains a moving target. Look out for Penguin 2.0 – that's due to be rolled out soon, by all accounts.
I recently entered a competition run by Google relating to the much anticipated release of Google Glass, a new technology about to sweep the globe. These are “wearable” computers that deliver information through a small device attached to otherwise fairly ordinary looking glasses.
These special glasses however are replete with some stunning technology that allows information to be fed continuously to the wearer through some ingenious optics positioned over the right eye and sound conducted through your skull. The new technologies in Google Glass are a leap forward for hands-free computing and once fully developed will no doubt make the device a world best-seller.
Being invited to the initial Google Glass trial will almost certainly bring some great opportunities – what exactly Google has in mind for my suggestion I don't know (the competition was to suggest uses for Google Glass), but they picked me from perhaps half a million entries (along with 7,999 other lucky people) and confirmed in my G+ messages that I would be contacted shortly.
No doubt signing an NDA will be step one, so this may be as much as I am able to say about Google Glass without their permission! I care little about that – I just can't wait to find out more, as I am a geek at heart. Will they send me a pair? Unlikely, as they are set to cost about $1,500, but I imagine the originals will keep their value pretty well, so as an investment alone it would be worth it.
The specifications include bone-conduction speakers which, as your jawbone already does, transmit sound through your bones to your ears. A video camera points forward and ingests visual data, while wireless communications allow software to interact with that data. This will afford the user a more detailed view of the world, depending on what they have set their device up to do for them. The innovation has only just begun; developing apps and functionality for these totally unique new devices will no doubt become an industry of its own, and one which will grow fast too. An SEO expert will now have to become a GGO expert (Google Glass Optimizer – yes, you heard it here first!)
Google Glass is new, it's here, and I am ready to try it out – if I can tell you more about it, I will for sure; if I can't, you will have to wait your turn until they appear in stores.
You would think all-in-one printer/scanners should have put paid to the need for photocopiers years ago, but photocopiers have stood the test of time and necessity and still have a major role to play in the office. They have also evolved along the way. When I first had an office job, they had to be kept in a separate room because of the noxious fumes and the mess that surrounded them every time the toner was changed.
Changing toner was the work of the expert technician back then. Open the doors and you were faced with trays of powder that, if spilled, could make you look like a coal miner in a moment. I worked as a field service manager for computer companies, and toner spillages were a nightmare for the repair shop or anyone who had to deal with them. We had special static-free vacuum cleaners and one poor engineer whose job it was to go and clean up any mess left by the toner. The customer paid heavily for such clean-ups.
Onsite repair technicians would spot the tell-tale puff of toner around the edges of the lid and openings of a laser printer or copier and would know never to open the unit. Instead they would take it back to the repair shop and let someone else face being coated head to toe in black powder for a few days. Now you can change the toner in a copier in a moment, and it's easy enough for anyone to do – child's play. Probably best to leave it to an adult still, though.
Copier/printers are not particularly cheap devices, but they cover the needs of many users and end up being very affordable once running costs are considered. They are expensive because shifting large volumes of paper requires hardware that can sustain constant use. Your average desktop printer/scanner would collapse under the strain of printing all day!
Because they are a larger investment, it pays to lease a printer/photocopier combination: you get the latest gear, it is looked after by proper technicians, and you have a fixed monthly cost. You won't need to buy desktop printers any more, either – a wireless printer/photocopier can receive print jobs from anywhere and print them with superb quality at a fraction of the consumables cost.
So back to the initial comment about the all-in-one desktop printer taking over from the photocopier: what really happened was that the photocopier turned the tables and took over from the desktop. They took over in style too – the features available on the latest copier/printers are stunning, to say the least. Who could have imagined that the most hated piece of machinery in the office would become the most liked? Copier karma?
Article writing is a particular way of presenting information to people online in a popularly accepted format. Articles contain mostly text, but also other media, with a typical length of between 300 and 750 words, usually covering a single topic. When writing an article it is important to begin with an introduction of 100 or so words, which should contain the majority of the article keywords. The introduction has a number of functions: setting the tone for the reader, introducing the subject matter of the article, and using article keywords to promote search ranking. The main article keyword should appear within the first 160 characters because, although search engines use the Title and Description meta tags (where, incidentally, the article keywords should also appear), some use page excerpts to create search results and commonly look most intently at the first sentence for index-worthy keywords.
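The 160-character rule is easy to sanity-check programmatically when drafting. A minimal sketch (the function name and sample text are my own, not part of any SEO tool):

```python
def keyword_in_lead(article_text: str, keyword: str, window: int = 160) -> bool:
    """Return True if the main keyword appears in the first `window` characters."""
    return keyword.lower() in article_text[:window].lower()

intro = ("Article writing is a skill anyone can learn. Good article writing "
         "starts with an introduction that tells the reader what to expect.")

print(keyword_in_lead(intro, "article writing"))  # True
print(keyword_in_lead(intro, "link building"))    # False
```

Run it over each draft introduction before publishing and you will never forget to front-load the main keyword.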
Add some body to the article you are writing
At about this point you should be fleshing out the information you introduced above. Unless the platform where you publish lets you create meta tags, you won't have as much control over how your article keywords are used. Many article sites do allow you to add your own meta tags, but usually they are generated directly from the post title and the opening paragraph, reinforcing the need to get those into good shape. If you are writing an article that will be published on your own site, then you will have full control over your article keywords and meta tags.
When writing an article, there are a number of main components that should be included:
- An attention grabbing title in <H1> or <H2> heading style
- An introduction containing the main keyword in the first 160 characters
- One/more subtitles that contain article keywords in <H2> or <H3> heading style
- A nice bullet list like this to help readers easily assimilate important points
- Some body text, perhaps 200-300 words
- A conclusion to sum up the article
- A Resource Box with links and author information.
Inventive use of article keywords
While the text should cover all the important points in an article, keyword selection and full utilization of the available vocabulary should be primary considerations. Article keywords can span punctuation too, joining into a multiple-keyword string that makes a short phrase very powerful indeed. In the first line of this paragraph, for example, "article, keyword selection" yields both "article keyword" and "keyword selection", because search engines largely ignore the punctuation between the words.
Last thoughts about article writing
Once you have covered everything in detail, the conclusion to your article should recap the major points made in the body text without repeating the article keywords (or article keyword, if you chose only one) more than once. In this example we used four keywords and put them all in the last paragraph. Article writing is not rocket science, and anyone can get to grips with both the structure and the SEO in no time – try it!
Recently I developed a website for a client using one of my own spare domains, which I keep for these purposes. The intention was to migrate the WordPress site to his domain at the end – a process I have successfully completed numerous times. Initially the client had his hosting at domain.com, where there were a few problems with importing databases that were not supported by the type of server implementation they had. Having created the site, I was ready to make it live. After the usual fiddling around exporting and re-importing the SQL databases, doing a search-and-replace in them to change the domain name instances, and altering settings in WordPress, I came across two problems. The first was irritating and took me a little while to figure out: pages would not display, and '404 Not Found' errors kept cropping up.
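That search-and-replace step has a well-known gotcha: WordPress stores many settings as PHP-serialized strings whose character count is encoded in the data (s:22:"..."), so a plain text replacement corrupts them. Here is a rough sketch of a replacement that also repairs those length prefixes – an illustration only, as real dumps have more edge cases, and WP-CLI's search-replace command handles serialization properly:

```python
import re

def replace_domain(sql_dump: str, old: str, new: str) -> str:
    """Swap one domain for another in a SQL dump, then repair the length
    prefixes of PHP-serialized strings (s:<len>:"...";) broken by the swap."""
    swapped = sql_dump.replace(old, new)
    # WordPress options and widget settings are often serialized, so
    # recompute each byte count. (Naive pattern: assumes the serialized
    # string itself contains no '";' sequence.)
    return re.sub(
        r's:\d+:"(.*?)";',
        lambda m: 's:%d:"%s";' % (len(m.group(1).encode("utf-8")), m.group(1)),
        swapped,
    )

dump = 's:22:"http://devdomain.com/x";'
print(replace_domain(dump, "devdomain.com", "client.ca"))  # s:18:"http://client.ca/x";
```

If you skip the length repair, WordPress silently discards the corrupted options, which is how widgets and theme settings vanish after a migration.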
Broken Permalinks After Moving the Domain
What I discovered was that the permalinks in WordPress (which used a custom structure) had not updated to the new domain name. It took 10 seconds to fix: set the permalinks in WordPress to a different setting and save, then change back to the custom permalinks, and everything picked up the correct domain name. An easy fix, just hard to find.
The next problem, however, was not so easy to sort out. Having repointed the nameservers from domain.com to my Bluehost account, the site was functioning for me after just an hour or so, but for my client in Canada it was not. It turned out that domain propagation was causing some odd effects.
Domain Propagation – The Bane of Developer-Client Relationships
Domain propagation is a strange process. When a site moves from one domain name to another and nameservers are changed, the world has to be informed of the change. The way the Internet works is that when a domain name is registered, the domain itself is initially "in limbo" until a hosting account (on the same server as the domain or elsewhere) is acquired and the nameservers (which point visitors to your hosting server) are set to point to it.
You can enter a domain into this useful site and check various aspects of DNS propagation:
Confusing? Yes it is a little. What makes it worse is that the site where the domain is registered does not necessarily have to be the host too. For example, if you purchase a domain at GoDaddy, you can “point” the nameservers to another server such as at HostGator where you have purchased a hosting account.
However, when a domain is moved from one server to another, things get a little wonky for a few days, and it can look as if the developer (or "the Internet") is causing a problem that prevents the site from being found. It is hard to explain to a client what is going on. Worse, the problem can cause parts of a site to appear mixed in with other parts that are out of date. The easiest approach is not to show the client the site until propagation is complete, but in my experience most clients are very much hands-on when it comes to checking out their new site – rightly so, as they are paying for it to be built.
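One way to demonstrate the effect is to ask the local resolver directly. A minimal sketch using Python's standard library (the domain names are placeholders – during propagation, you and your client can get different answers for the same name):

```python
import socket

def resolves_to(domain: str) -> str:
    """Ask the local resolver (including your ISP's cache) for a domain's
    current IPv4 address. While a move is propagating, resolvers in
    different regions return different answers for the same name."""
    try:
        return socket.gethostbyname(domain)
    except socket.gaierror:
        return "unresolved"

print(resolves_to("example.com"))           # whatever your resolver currently says
print(resolves_to("no-such-host.invalid"))  # unresolved
```

Run the same check from your machine and the client's (or via an online propagation checker) and the mismatch, rather than any fault of the developer, becomes obvious.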
How does domain propagation cause issues?
The reason is that, to keep the Internet running smoothly, the servers that deliver pages to your browser (i.e. at your ISP) keep a copy (cache) of pages locally to save time when you ask for them again. If you need to refresh a browser page, pressing F5 will refresh it from the nearest cache – on your local machine. If you clear your browser cache and hit F5, the browser will not find any locally cached pages, information or images, so it will request a copy from your ISP's server. They too keep a page cache – one which you unfortunately cannot clear. The ISP cache is refreshed by a mechanism called TTL (Time To Live), which denotes the lifespan a page in the cache will have before the server refreshes it with a new copy from the host server where your site is stored – in our example, HostGator. Most Internet data carries a TTL of some kind: cached pages and DNS records count down in seconds, while individual network packets carry a hop count that is decremented at each router. Once the countdown reaches zero, the data is discarded. For cached records this prevents stale copies living forever; for packets, it stops undeliverable data wandering the network and causing congestion.
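The caching behaviour described above can be sketched as a tiny model (a toy illustration only, not how any real ISP cache is implemented):

```python
import time

class TtlCache:
    """Toy model of a TTL cache: an entry lives for `ttl` seconds; after
    that, the next lookup misses and a fresh copy is fetched from origin."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (value, expiry timestamp)

    def get(self, key, fetch):
        value, expiry = self.store.get(key, (None, 0.0))
        if time.monotonic() >= expiry:  # expired, or never cached
            value = fetch(key)          # go back to the origin (host) server
            self.store[key] = (value, time.monotonic() + self.ttl)
        return value

# An ISP-style cache with a 24-hour TTL (86,400 seconds).
cache = TtlCache(ttl_seconds=86400)
first = cache.get("example.com/index.html", fetch=lambda k: "old page")
# The origin now serves a new page, but the cached copy is still "live":
second = cache.get("example.com/index.html", fetch=lambda k: "NEW page")
print(first, "|", second)  # old page | old page
```

Until the entry's clock runs out, the cache keeps handing back the old copy no matter what the origin now serves – which is exactly what a freshly moved site looks like from a distant ISP.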
This chain of refreshes gets broken when you change nameservers and move domains. Pages and nameserver pointers are cached on servers across the world, and when a refresh is requested numerous problems become evident: the wrong page appears, menus may not work, and images may be incorrectly displayed or missing – it all depends on what has been cached and what is delivered as fresh data.
For a few days this is a pain, as clients see a mish-mash of their site, which can look a real mess. The TTL at most servers is measured in seconds, and for a web page can be something in the order of 24 hours (86,400 seconds); cached nameserver records carry a TTL of their own. If the server is in a foreign country, the effective TTL will most likely be even longer in real terms. Setting a shorter TTL at your host will make little difference to propagation already under way, as the caches in the servers along the route refresh on their own schedule. The process is automatic and can take a day or so regionally or nationally – several days internationally.
After the TTL expires for all existing pages and their components, the system gradually refreshes completely. This might take 3-4 days to complete everywhere around the globe. There is no real way around the situation, so being patient and letting the servers refresh is all you can do – and remember to keep hitting Ctrl+F5 when you look at a moved or new site, so you bypass your browser cache and get the local ISP's latest copy.
One of the modern-day Internet-based technological improvements is the cloud. The cloud, in its simplest form, is a way to move data and software between Internet-connected devices, and comprises interconnected hardware that collectively makes up a single entity – the cloud.
Inside the cloud there are countless cheap or free services you can use. One of these is a product called Xeround. Its makers compiled a list of 43 of the best free cloud services for app developers and integrated them into a single platform. Some of the offerings will already be familiar to users from a wide variety of fields, such as Google Analytics. There is a free cloud service for issue tracking, web analytics, load testing, source code management and many other functions.
People use the cloud for a wide variety of reasons. For personal use, it allows easy access to your material from any device that you or your family owns. For example, you could use the cloud to keep a file and open it from your tablet, smartphone and notebook computer without the use of a USB drive.
In business, it is a wonderful resource for organizations to share content between employees. It is especially advantageous for staff at different locations who are collaborating on a job. Technology-savvy corporations are implementing it all around the globe, and it has thus far proven to substantially enhance productivity.
Application developers use the cloud to share a multitude of material. It is a great help when they are developing mobile apps for smartphones and tablets, or working on computer software. Most developers have an area of focus, whether it is mobile apps or enterprise e-commerce applications.
The greatest part is that they are all entirely free cloud services!
To see more that Xeround has to offer go to http://xeround.com.
Google's quality standards are higher than ever. In fact, Google has declared war on poor-quality websites, as its recent Penguin and Panda updates have shown. They have not been shy in saying that this is the year for Google to weed out the crud from the SERPs (Search Engine Results Pages).
A lot of my business is writing and editing, and I see dreadful stuff being published online and on the front pages of Google. It makes my own writing feel less worthwhile when I see it rank below those sites, even though I know my quality outstrips theirs by a long shot. Some of my clients' sites that suffered in the updates were genuine sites with great products. They did, however, have many poor incoming links – a specific target of the Google SERPs laundromat, as the algorithm changes have become known.
The problem for many has been purchased SEO. That's pretty much it in a nutshell – the wonglers (those who cannot write proper English, or who spin content into the indecipherable) are being thrown off the cliff, and generally the wranglers are left. There is still plenty of work to do, and SEO professionals like myself now have to figure out how to regain ranking for broken sites without recourse to the usual link-building, which now seems like a dying art in its present form. Another effect of the updates, where a site was punished for poor links, is a downgraded PageRank. If your PageRank has dropped recently, the updates are most probably to blame. PageRank drops may also not be immediately noticeable, as PageRank is only updated every three months or so. Find out more about dropped PageRank here.
I did notice, however, that sites with blogs seem to have done pretty well in light of the Google updates, so I will be focusing more on that in the coming months. For those who see no light at the end of the tunnel, you have my condolences for the loss of your ranking, and I hope I can be of assistance in clawing your way back up. The Google reconsideration request tool, combined with the new Google Disavow Links tool, is the first step for many.
If you need SEO experts to help you get back in shape in the search engines, contact PhraseSet today and we can get your business on the road to recovery.
By Tim Higgins
Multiple Domains can be a problem for ranking in Google
Multiple domains/sites are not popular with Google and unless they have clearly different objectives and products, they will be canned in the SERPs (Search Engine Results Pages).
Multiple domains or sites selling the same product lines, overlapping lines, or syndicated/copied product content are a bad idea – Google hates them, and no matter how hard you try to cloak your ownership, they can usually find out who you are, because these sites typically share one or more of the following:
- Same telephone numbers for customers to call – easy to cross check
- Interlinked sites – many multiple-site owners seem to think that linking their sites together makes a difference; it does not, and Google has said so before.
- Same IP address/host for all the domains – that's the easiest thing for them to check, and very cumbersome to get around; taking out 20 hosting accounts is no fun and too unwieldy anyway.
- Whois details – even with privacy, there are some details that can be extracted from the registrars that can show them multiple domain connections.
If anyone thinks they can get around all that, consider this: when you bought your multiple domains, did you get them all from the same vendor on the same day? Did you pay with the same credit card? Did you fail to choose private registration when you bought the domains (in which case there is a whois record with your name on it)? All these things are trackable by Google, including the history of your web usage. If you log in to the WordPress dashboards of all your domains from a single IP address, analytics can reveal that to Google. Your Webmaster Tools account is another place where multiple domains owned by a single person or entity are grouped together.
All of that is indexed by Google. There is a long trail left behind by Internet activity, and Google handles a lot of it; Google monitors nearly every action we take in one way or another. Welcome to Big Brother. So if you think that you cannot be touched – think again. The nature of Google's information collection is intrinsically covert, and what they do with that data is largely unknown. They certainly use it to weed out the unscrupulous, and multiple sites are frowned upon heavily these days.
So you see there is no escaping the insidious nature of Internet information tracking and in general we have given Google permission to do so – simply agreeing to the terms of Google Analytics does that.
My advice to those with multiple sites with overlapping or identical product lines is to begin reducing the effort put into the least successful sites and increasing it for the successful ones, growing them in size. Running backlink campaigns these days is a waste of effort – the money and time are better invested in broadening the scope of your content, increasing your page count, blogging at least twice a week, and improving the user experience with more functionality such as videos. The next thing is to run proper social campaigns, because that's where the consumers hang out. Facebook is now the world's largest 'storeless mall' – people hang out there all day waiting for a juicy tidbit or offer, and pounce in their thousands when they see something they want, like pedestrians walking through a mall with no stores in it.
The answer to those having ranking problems with multiple domains is to
- diversify your product ranges
- make the most of social media – Facebook, StumbleUpon, Twitter etc.
- do not use duplicate/syndicated content
- do not have heavily overlapping product lines
- do not link sites unnaturally together. i.e. do not link your pinkhandbags.com site to your petfood.com site.
- READ THE GOOGLE WEBMASTER GUIDELINES!
- READ THE BEST PRACTICES FOR RUNNING MULTIPLE SITES
- Use Google’s Reconsideration Request tool and Disavow Links tool
If you have problems reported in your Webmaster Tools account, you can find out about cleansing your domains here.
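For anyone reaching for the Disavow Links tool mentioned in the list above: the disavow file itself is just a plain text list, one entry per line, uploaded through the tool in Webmaster Tools. Lines beginning with `#` are comments, a `domain:` prefix disavows every link from a whole domain, and a bare URL disavows a single page. A minimal sketch (the domains shown are made-up examples):

```
# Spammy directory linking to us sitewide; removal requests went unanswered
domain:cheap-links-example.com

# Individual paid-link pages we want Google to ignore
http://blog-network-example.net/post123.html
http://blog-network-example.net/post456.html
```

Keep a comment trail of your removal attempts in the file – a Reconsideration Request is far more likely to succeed if you can show you tried to clean up the links before disavowing them.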
Be aware that 2012 is an important year for Google – these updates signal the definite end of an SEO era. Pity those making a living selling backlinks, because theirs is now an obsolete industry – they should all have seen it coming, however. I am a writer and get to see the awful quality of many of the sites I am hired to rewrite – I can only imagine business will improve for us in that respect as webmasters everywhere realize the value of offering a quality website experience for users. Take note – one thing is for certain: Google does.
If you need help with reorganizing your domains, talk to PhraseSet today – we can help make some sense out of the SERPs madness for you!
This week I have been working with clients on the results of the latest Google updates. The Panda and Penguin updates have been a constant reminder this year that it is Google that owns the search engine, and they are not shy about modifying it to meet their stated long-term objective: to rank only quality sites. There is also a requirement not to use optimization tactics considered "black hat" SEO – these include keyword stuffing, link farming and other forbidden maneuvers.
The list of activities actively punished by Google gets longer each year; this year, however, has been a comparatively tumultuous one for SEO. The first thing the big G did was to go on a witch-hunt to root out poor quality sites. This update – Panda – removed a large number of sites from the search results, in the millions, and took place in the first half of the year. It was followed by another far-reaching update – Penguin – which de-ranked or de-indexed millions of over-optimized pages: pages with backlinks from dubious sources and keyword-stuffed anchor text originating from non-relevant sites. These updates affected other sites too; the blog pages of SEO companies abound with harrowing tales of website owners whose businesses just went down the pan. Sadly, some of those were pretty good quality sites by all accounts, but the algorithms caught some innocents and punished their sites regardless, replacing them with undoubtedly poorer quality ones. Whether time will allow them to recover is yet to be revealed.
There is one consolation: the recently defined Google Reconsideration Request procedure allows site owners to ask for a review. It is not all it seems, however, so you should definitely read the information before you pursue it.
The Panda and Penguin updates have left behind a trail of consternation and, in many cases, misery and tears as devastated business owners try to pick up the pieces. Google has, as usual, demonstrated little sympathy and, apart from the new "disavow backlinks" procedure that site owners can use to remove spam backlinks, has done nothing other than issue some vague statements about what they deem good quality and what not. They do publish website quality guidelines for those who can be bothered to read them. For an SEO professional, navigating the SEO minefield is now a more difficult task than ever. These were not single applications of a new algorithm either – Panda is now on its 20th revision. My Webmaster Tools graphs look like a cross-section of the Alps now, where previously they were very much either a steady growth curve or a plateau. One site I am looking at this week has dropped from 40,000 impressions to under 4,000 – I will no doubt be asked to look at others over the coming weeks. Google states that it may take some affected sites a "few weeks" to recover from the algorithm change – many site owners will be waiting with bated breath…
The last update amongst all this was the EMD update. Despite what the name might suggest, this one demoted low-quality sites whose main advantage was an Exact Match Domain, but as with all these updates, for every winner there is a loser: thin sites with little on them beyond a domain name that exactly matched a keyword, which previously ranked well, have now disappeared from the results.
One thing is certain: the world of search is changing again. Gone are the opportunities to rank well by purchasing backlinks and other similar short-cuts. What Google wants is quality, which means sites written by native English copywriters, SEO that is natural rather than forced and, above all, the best user experience. Niche site building is probably now a thing of the past, as small sites are not ranking anymore; niche marketers are already quite vocal about the loss of business – one I know of had 200 ranking domains, and this year 120 of them disappeared from Google. The remaining sites are larger, so they escaped the wrath of the big G.
If your sites are suddenly not performing, it could well be that you have been hit by yet another Google update – maybe even by multiple updates, which makes it very confusing to unravel the issue for a particular site. Be aware, though, that any SEO you apply had better be either very inventive or strictly within the Google guidelines, or you can watch your domains become invisible.
If you have been hit by Google's Panda, Penguin or EMD updates, you may need some help to restore your site to its previous position. Some sites are irredeemable, in that they are not worth the expense of trying to get them to rank again; in these cases it is more prudent to create a new site and start again. Regardless of the damage, once de-ranked, it is a hard climb back to the top. If you are de-indexed as well, experience has shown you may as well start over…
In academic terms a G+ would be an awful score – something like "below F" (Fail). How you can be less successful than failure is undetermined, which is why our grading system happily ends before this notional G+ rating. When I was young we took school examinations at age 16 called O-Levels. (I achieved 7, for the record, which was not a ground-breaking number by any means.) The grading system ran from A to C as passes; D and E were considered redeemable failures. "UN" was a resolute FAIL. This last grade was perceptibly worse than a G+. UN signified "ungraded," a distinctly demeaning result for an examination: it basically indicated that the pupil had no knowledge of the subject worthy of an examination mark. I got this in only two subjects; the first was the History of Music – which surprised the music teacher, as in practical music I had a Grade II in piano and a Grade V in notation. Those grades required knowledge of actual music and not its history, a subject at which I was a virtual amnesiac. I have been playing instruments (the guitar mainly) for 28 years now but would still be likely to score an UNgraded mark in a Music History exam.
My other epic failure was in Chemistry, despite the valiant efforts of a relatively good teacher in the subject at the time. I now understand chemistry very well (I freelance for a genetic scientist so I have to know a fair bit!) so in contrast to the music history I could definitely get a decent pass in that chemistry exam now.
Back in the day, when I did not, however, G+ as a grade would definitely have looked better on my report card than UN. I believe the letter F was not used at all, as they did not want to label kids as "F"ailures. UNgraded is so much more dignified – as if perhaps they just didn't have time to mark the exam paper.
G+ these days denotes success on a vast scale. G+ (Google Plus) is now the fastest growing social network ever. With the growing attraction of little interference from advertising and a much simpler interface than Facebook, G+ certainly has my thumbs-up. I am sick of the intervention inherent to Facebook's insidious advertising, permission-giving and general popup-driven frenzy of activity that just makes my online life so much more complicated. I put up with it to stay in touch with a few people, and that's about it. I am tired of having to go through Facebook's settings to restore order, tired of telling Facebook I don't want to play Farmville or the thousand other irritating games on offer. These games exist merely to get people to buy "coins" that let them progress faster or have more resources when they play. The current Facebook model, for me as a user, is ironically an epic "F"ail.
Not that I use G+ a great deal either, other than for promotional purposes. Having G+ "+1s" on your page helps a lot in search, I notice. Advertising on Facebook does have a wide reach, however, and from a marketing perspective it is more useful than Yahoo/Bing when it comes to generating PPC actions. AdWords is still the only way to appear in Google via PPC.
So is Google the 'Augmented G' it purports to be? It's hard to tell, because their advertising strategy for G+ is not yet mapped out, so marketers like us have to wait and see whether G+ becomes just another advertising admin panel to maintain alongside AdWords, Yahoo/Bing and Facebook PPC. Campaign management already requires monitoring a number of admin panels that display information in different ways. Unless it is integrated into AdWords, adding a G+ PPC admin panel will add roughly a 30% overhead to the campaigns already being run for an average client. Getting clients to advertise in all of these places is already a difficult sell and will be too expensive for many; adding more expenditure is never a popular move unless you can guarantee results… and G+ results are unfortunately not guaranteed yet, so maybe that is why they are waiting to roll out their advertising model. Maybe they have a new advertising model in mind at Google? Who knows.
What is for sure is that Google will have to be careful, because a G+ can easily turn into an F. As with an exam, the result depends on those marking your paper – in Google's case that is us – and being good in one area of expertise does not automatically make them good in others. Time will tell.
My taxes are uncomplicated so I don't really worry too much – I use tax preparation software and it usually takes me about half a day. Recent SEO and content/site production for silvertaxgroup.com, 911backtaxhelp.com and backtaxadvice.com led me to believe that there are many others, however, who do not have such an easy path when it comes to preparing their tax returns. What is evident – and it played out the same way in my own life many years ago – is that the IRS is unsympathetically relentless in its pursuit of back taxes.
The powers that the IRS has are immense. It can levy your bank accounts, put a wage garnishment on your paycheck and generally relieve you of every asset you have until your tax debt is paid. The "forcible" collection process is easy for the IRS to instigate because it has huge legal jurisdiction over your assets when back taxes are owed.
If you owe back taxes, it transpires, the time to act is right now, because any minute the IRS notices piling up on your doormat will be converted into collection actions. If you owe a lot of back taxes you need a tax attorney, because you are already in the crosshairs of the IRS and they will target you as soon as they legally can. If, like me, you can get yours in at 10 p.m. on the 15th of April, you are probably very lucky indeed. Without some software, however, I would be starting in March… or December, maybe?
The Weekly Blog is the Mainstay of Many a Non-competing Website.
For those of us trying to maintain a decent web presence, the weekly blog has become something of a necessity in order to maintain a presence and take a keyword opportunity. The weekly blog for me personally is a chance to express myself in words without having to conform to some business notion or new SEO idea; I write about that sort of thing enough during the day for my clients. The need for SEO services is so great, in fact, that not only am I often too busy to write my weekly blog, I have spent a considerable proportion of my time writing about the topic for ebooks and a myriad of websites all performing SEO in some form or another.
Curiously, I have never really paid much attention to the SEO on my own site. I rank highly for a number of disparate keywords, as the varied topics of my weekly blog create a relatively dysfunctional SERPs presence. Most of my work is gained through oDesk, where my reputation and feedback are a touch ahead of most competing contractors, so I am rarely short of work. I could easily focus my mind on the SEO for my site and up the ante with my weekly blog in order to demonstrate keyword prowess, as is the wont of a number of my competitors. The pie is so large that I worry little about competition; I focus entirely on maintaining the quality of my work.
My weekly blog is more a necessity in the world of SEO: keywords for which I have no use, but a pagerank that must be maintained and nurtured. So I will soldier on with tricky keywords in the real world of SEO, AdWords campaigns and intensive efforts for difficult but lucrative keywords. Tomorrow morning I will be back at the grindstone making other sites popular. As for mine? Apparently a weekly blog is enough.