Tuesday, March 10, 2015

Why So Many Geeks Hate Internet Explorer

It’s common knowledge that almost every single geek hates Internet Explorer with a passion, but have you ever wondered why? Let’s take a fair look at the history and where it all began… for posterity, if nothing else.
Contrary to what you might think, this article is not meant to be a hate-fest on Internet Explorer. In fact, since IE9, Microsoft has continued to improve performance, add new features, and generally make the browser standards-compliant.


In the Beginning There Was IE, and It Was Good?

We’ve all been so used to thinking of Internet Explorer as that slow, buggy browser that is behind the times, but it wasn’t always that way—in fact, way back when, Internet Explorer pioneered many innovations that made the web what it is today.

Here’s a quick tour through the easily forgotten history of the infamous browser:

1996: Internet Explorer 3
This version of the browser, introduced in 1996, was the first to implement CSS (Cascading Style Sheets). Yes, you’re reading that correctly. It also introduced many new features like Java applets and, sadly, ActiveX controls.

1997: Internet Explorer 4
IE4 introduced a blazing fast (at the time) rendering engine as an embeddable component that could be used in other applications—this was a lot more important than people realize. This version also introduced Dynamic HTML, which allows web pages to dynamically change the page using JavaScript, and added Active Desktop integration.

Even weirder? Seems like nobody remembers this anymore, but IE4 was actually cross-platform: you could install it on Mac OS, Solaris, and HP-UX. By the time IE5 was released, IE4 had reached a 60 percent market share.

1999: Internet Explorer 5.x
Microsoft invented Ajax. Wait… what? That’s right, it was this version of IE that introduced the XMLHttpRequest feature in JavaScript, which forms the underlying technology behind every web application you’re using today—you know, like Gmail. Of course, the term “Ajax” wasn’t actually coined until years later by somebody other than Microsoft, but this release supported everything required to make it work.
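For context, a bare-bones Ajax call built on that API looks something like this. It's a browser-only sketch (it won't run outside a page), and the URL and element id are made up for illustration:

```javascript
// Browser-only sketch: fetch data in the background with XMLHttpRequest,
// the API that IE5's JavaScript engine first made possible.
var xhr = new XMLHttpRequest();
xhr.open("GET", "/api/messages", true); // async request; URL is hypothetical
xhr.onreadystatechange = function () {
  if (xhr.readyState === 4 && xhr.status === 200) {
    // Update part of the page without a full reload -- the whole point of Ajax
    document.getElementById("inbox").innerHTML = xhr.responseText;
  }
};
xhr.send();
```

(Strictly speaking, IE5 exposed this as an ActiveX object rather than a native `XMLHttpRequest` constructor, but the idea is the same.)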

So Yes, Microsoft Innovated
From IE3 until IE6, Microsoft used all their resources to simply out-innovate the competition, releasing new features and better browsers faster than Netscape. In fact, Netscape 3 Gold was a buggy piece of junk that crashed all the time, and Netscape 4 was extremely slow and could barely render tables—much less CSS, which would often cause the browser to crash.
To put it in context: web developers used to complain about Netscape the same way they complain about IE6 now.


What Made It Go So Very Wrong?

The trouble all started when Microsoft integrated IE into Windows as a required component, and made it difficult to uninstall and use an alternate browser. Then there was the whole business with them exploiting their monopoly to try and push Netscape out of the market, and a lot of people started to view Microsoft as the evil empire.

Microsoft Stopped Trying
By the time Microsoft released Internet Explorer 6 in 2001, complete with lots of new features for web developers, there was no competition left and IE had a 95 percent market share, so Microsoft just stopped trying. Seriously, they did nothing for five years, even after Firefox was released and geeks started migrating left and right.

Microsoft-Specific Features
The whole problem with Microsoft’s innovation is that much of it was done in ways that didn’t follow the web standards—this wasn’t as big of a problem when Internet Explorer was the only game in town, but once Firefox and Webkit came around and started following the standards correctly, suddenly it became a huge problem for web developers.

Security Holes and Crashing
Since Microsoft decided they didn’t need to try anymore, and they didn’t keep up with the competition from Firefox and other browsers, bugs and security holes just cropped up left and right—really terrible ones, too. For instance, this code is all that is required to crash IE6:
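The exact snippet from the original article isn't preserved here, but the most widely cited one-liner for this bug is the following (reproduced from memory, so treat it as illustrative rather than authoritative):

```html
<!-- A single line of HTML widely reported to crash unpatched IE6 -->
<style>*{position:relative}</style><table><input></table>
```

A universal `position:relative` rule combined with a stray form control inside a table was enough to take the whole browser down.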

IE7 and IE8 Were Too Little, Too Late
It took five years after IE6 for Microsoft to finally get around to releasing IE7, which added tabs and made the browser slightly more tolerable, but for web designers it was still a nightmare to deal with, and only complicated the issue since now you had to make pages render correctly in two lousy browsers instead of just one.

It took another 2.5 years for Microsoft to finally release Internet Explorer 8, which greatly improved CSS support for web developers and added new features like private browsing, tab isolation to prevent one bad page from taking down the whole browser, and phishing protection. By this point, most geeks had already moved on to Firefox, and then some of us to Google Chrome.


Here’s the Real Reason Geeks Hate IE

Just because we’re geeks doesn’t mean we hate everything that’s inferior and outdated—in fact, we often love retro computing—that’s why we love Atari, NES, Commodore 64, etc. We take pride in our geek knowledge. So why’s Internet Explorer a different story?
Here are a couple of reasons that fueled our hatred of the buggy browser, and finally put us all over the edge:

Supporting IE is Like a Fork in the Eye for Web Devs
Here’s a sample of a day in the life of a web designer: You spend hours making sure that your page looks great, and you test it out in Google Chrome, Firefox, Safari, and even Opera. It looks great, awesome!

Now you open up IE and the page looks like somebody put it into a blender and hit the Whip button. Then you spend double the amount of time trying to fix it to look tolerable in IE6 and IE7, cursing loudly the entire time.

Luckily, by 2014 Internet Explorer 6 and 7 had become a statistical anomaly in actual Internet usage, and most of the bigger websites had completely stopped supporting them. Even Internet Explorer 8 usage has dropped to single-digit percentages on many websites.

Geeks Being Forced to Use Internet Explorer
And here’s where we come to the real issue—the whole reason that geeks can’t stand Internet Explorer:
Geeks everywhere were forced to use Internet Explorer at work even when better browsers existed, forced to support it for corporate applications, forced to make sure web sites still worked in IE, and we couldn’t convince everybody to switch to a better browser.
Geeks don’t hate something that’s inferior—but they do hate it when it’s forced on them.


The Good News: The Future Is Brighter for IE

Thankfully, it seems like Microsoft has finally learned from their many, many mistakes in the browser world. Internet Explorer 10 and 11 are blazing fast, mostly standards-compliant, and, apart from an outdated UI that really needs some love, a solid choice for anybody. There are even rumors that Microsoft might finally release a better user interface for IE in Windows 10. Here’s hoping!
In fact, based on our recent testing, a lot of the new malware isn’t even targeting Internet Explorer anymore, because writing plugins for IE is a complicated thing, whereas writing some quick HTML and JavaScript code to make spying adware extensions for Firefox or Chrome is really easy.
It’s a whole new world, and Chrome, rather than IE, is the target.

Originally published by How-To Geek.
Lowell Heddings, better known online as the How-To Geek, spends all his free time bringing you fresh geekery on a daily basis.

Wednesday, November 5, 2014

Google Bot Evolution

Google never, ever does something for no reason. Sometimes it’s just a matter of waiting patiently to figure out what that reason is.

In May, Google created a fetch and render tool in Google Webmaster Tools that was built to render web pages properly for Googlebot. At the time, it was unclear why the company was introducing the tool, though it hinted at future plans that would involve fetch and render.

On Oct. 27, we got a definitive answer.

That fetch and render tool foreshadowed the introduction of new guidelines, which say you could be negatively impacted in search rankings and indexing if you block your CSS or JavaScript files from being crawled. When you allow Googlebot to access these files, as well as your images, it can read your pages correctly. When you don’t, you can hurt the way the algorithms render your content, and your page rankings may decline as a result.
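In practice, the fix is usually in robots.txt: stop disallowing your asset directories. A minimal example (the directory names are hypothetical stand-ins for wherever your site keeps scripts, styles and images):

```
User-agent: Googlebot
Allow: /css/
Allow: /js/
Allow: /images/
Disallow: /admin/
```

The point is simply that nothing the page needs in order to render should be disallowed for Googlebot.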

So that tool that was put out a few months earlier was basically a warmup – it can be used to make sure Googlebot is rendering your web pages correctly.

It’s all part of a drive toward better user experience that is ultimately behind the changes Google has made.

The Nitty-Gritty of the Changes

Google says the change was basically to make its indexing system more like a modern browser, which has CSS and JavaScript turned on. So, as always, Google’s claim is that it’s doing this for the greater good. It wants to make sure it’s reading things just like the people who will be looking for your content.

That’s a big change from before, when Google’s indexing systems were more like text-only browsers. Google cites the example of Lynx. But the search engine says that approach no longer makes sense, since modern pages depend on CSS and JavaScript to render properly.

The search engine offers a few suggestions for optimal indexing, including:
•Getting rid of unnecessary downloads
•Merging your CSS and JavaScript files
•Using the progressive enhancement guidelines in your web design
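The file-merging suggestion can be as simple as concatenating assets at build time, so the browser (and Googlebot) fetches one file instead of several. Here's a minimal sketch (filenames and contents are invented) that writes two tiny stylesheets and combines them:

```python
from pathlib import Path

# Two stand-in stylesheets, as you might have one per component
Path("reset.css").write_text("body{margin:0}\n")
Path("layout.css").write_text("main{width:60em}\n")

# Concatenate them in a deliberate order into a single file;
# real build tools also minify, but the principle is the same
merged = "".join(Path(p).read_text() for p in ["reset.css", "layout.css"])
Path("styles.css").write_text(merged)
```

One merged request is cheaper than several small ones, which is exactly the "unnecessary downloads" point above.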

What This Means

With any Google change, the real question is what does this mean? How will it impact webmasters and what sort of impact could it have on SEO?

Clearly the answer to that second question is that sites that do not adhere to the suggested guidelines will see their search results suffer. Make sure your webmaster fully understands what Google is asking for, and discuss what type of changes should be implemented and how they could affect Google rankings.

Your aim is to create crawlable content, and that means doing whatever Google suggests. Use the fetch and render tool to make sure everything on your site is in order. It will crawl and display your site just as it would come up in your target audience’s browsers.

The tool will gather all your resources: CSS files, JavaScript files, pictures. Then it runs the code to render your page’s layout in an image. Once that has come up, you can do some detective work. Is Googlebot seeing the page in the same way it is rendered on your browser?

If yes, you are in good shape. If no, you need to figure out what tweaks to make so that Google is seeing the same thing you are. Here are potential problems that could be making your site’s content non-crawlable:

* Your website is blocking JavaScript or CSS

* Your server can’t handle the number of crawl requests you receive

* Your JavaScript is removing content from your pages

* Your JavaScript is too complex and is stopping the pages from rendering correctly

Why These Changes, Why Now

Google always has intent behind what it does, and here’s my read on its intent with these changes: it’s making user experience a bigger factor in its search rankings. Think about it: the emphasis on page loading and on rendering are two major steps in that direction.

That has also prompted speculation that the company could start using mobile user experience for its rankings as well. There has been rampant speculation in recent months, as mobile usage begins to overtake desktop, that Google will begin shifting its focus to the mobile web for search engine optimization.

So could this be one of the first steps on the way to those big changes? Perhaps. I always think it’s dangerous to try to get too many steps ahead of Google; the search engine likes to reverse course and throw people off from time to time. It does not like it when SEOs make changes in anticipation of its actions, preferring to dictate the course itself. And I do think the idea behind the crawlable-non-crawlable content changes makes sense. You have to keep up with the times.

But others could argue that keeping up with the times is exactly what Google will be doing by putting greater emphasis on mobile user experience.

The Bottom Line

Like any change from Google, this one will require adjustment and a fair bit of vigilance. I think it’s mostly a sign of things to come. User experience is really important to Google these days, and you would be wise to start looking at your mobile site in those terms. Make sure that you are doing everything you can to make your site mobile friendly, while still presenting a great desktop experience.

That way if Google does actually start penalizing based on poor mobile user experience, you will already be two steps ahead.

By Adrienne Erin 

Saturday, October 25, 2014

Son of a Breach! Can Companies Just Safeguard Their Customers’ Data?

Just when consumers were starting to regain some company trust and safe-shopping stability after last year’s massive Target breach, a string of new large-scale company breaches quickly reminded us consumers just how insecure our personal data can be.

Needless to say, it’s been a rough year for some major companies and an even rougher year for thousands of unlucky customers. Let’s look at three of the major breaches of the last couple of months.

Home Depot (Source: Krebs On Security)

Early last month, reports started coming in that the home improvement giant was investigating “some unusual activity with regards to its customer data.” Security reporter Brian Krebs immediately called it a credit card breach, especially since multiple banks came out to say that they were seeing evidence that Home Depot was the likely source of a batch of stolen credit and debit cards that went on sale on the cybercrime black market that morning.

Sure enough, six days later, the company admitted that its payment systems were in fact breached and that the hack had been going on for months. They went on to say that while credit card data was exposed, personal PINs were not. Reassurance (not really). And while the exact number of affected cards wasn’t known at that time, one thing was for certain: If you used a credit card at one of Home Depot’s U.S. or Canadian stores in the past 4-5 months, you needed to consider your credit card stolen and get on the phone with your bank ASAP.

About two weeks later (September 18th), Home Depot announced the number. A whopping 56 million cards were impacted, making the incident the biggest retail card breach…ever (on record, at least). The ‘silver lining’? Home Depot also said that the malware was now contained.

Japan Airlines (Source: Google Images)

Before the month of September passed (and with Home Depot still fresh on everyone’s minds), another large company from a completely different industry had some bad news to share with its customers…

On September 30th, Japan Airlines (JAL) confirmed that as many as 750,000 JAL Mileage Bank (JMB) frequent flyer club members’ personal info was at risk thanks to a breach. Apparently, hackers were able to get into JAL’s ‘Customer Information Management System’ by installing malware onto computers that had access to the system. The data that was accessed? Everything from names to home addresses to birth dates to JMB member numbers and enrollment dates. The good news is that credit card numbers and passwords did not appear to be exposed.

There have not been any new developments about this breach since JAL’s official statement on September 29th.

JP Morgan (Source: Reuters)

October 2014 was only two days young when yet another major company confirmed a data breach. This time, the victim was JP Morgan. Or rather, JP Morgan customers who used Chase.com and J.P. Morgan Online websites, as well as the Chase and JP Morgan mobile apps.

Last Thursday, the nation’s largest bank revealed that a mid-August cyberattack exposed personal info for 76 million households, as well as 7 million small businesses. More specifically, names, email addresses, phone numbers and addresses were stolen, while JP Morgan went on to say that there was no evidence that account numbers, passwords, Social Security numbers or birth dates were exposed. While the bank found out about the breach of its servers in August, it has since been determined that the attack began as early as June.

Unfortunately, not much else is certain at this time. What we do know is that Russian hackers are suspected (still not confirmed), over 90 of JP Morgan’s servers were affected, and it is believed that nine other financial institutions were also targeted (although we don’t know their identities). The lack of concrete information is scary in its own right, but the fact that JP Morgan is staying mum on the matter is even more troubling. According to a Huffington Post report from earlier today, the bank is refusing to say how many people were actually hit by the breach, with spokeswoman Trish Wexler saying that JP Morgan isn’t offering more details beyond what was announced last Thursday. This could mean that the breach, already the largest against a bank in history, is potentially even bigger than the reported 76 million households and 7 million small businesses, keeping in mind that ‘households’ is not the same thing as ‘individuals’.

Additionally, Fox Business is reporting that, according to its sources, the bank is now bracing for a massive spear-phishing campaign in the wake of the breach. Considering that no bank account info was compromised in the original breach (JP Morgan said in a statement that it hasn’t “seen unusual fraud activity related to the incident”), this is a plausible next step. Using the personal info obtained in the ‘first wave’, the attackers can send out legitimate-looking emails to the affected customers saying there is a problem with the user’s account and asking for Social Security numbers, passwords, etc. Alternatively, the emails could ask the customer to click an embedded link to update their account info, but in reality the customer is taken to an official-looking fake site from which the attackers can nab the important financial information. In either case, the virtual trap is activated at that point.

What to do?

It’s no secret that data breaches are on a steep rise. According to the Identity Theft Resource Center, there have been 579 data breaches this year, 27.5% more than there were at this time last year. And that number is only going to continue to increase.

In any of these three breaches, it’s important for customers to take basic security steps to ensure their information is safe, whether that means calling your bank and getting a new credit card issued (in the case of Home Depot), changing your password if you’re a JAL frequent flyer and JMB club member, or changing your log-in information and monitoring your online accounts if you bank with JP Morgan or Chase.

As more and more people choose to bank online (and become more internet-dependent in general), it’s also no secret that employing powerful and always up-to-date internet security on your devices is more crucial than ever before. Company breaches and spear-phishing attacks aren’t going anywhere. Take the necessary steps to keep your personal info protected!

October 6th, 2014 by Yegor Piatnitski

Thursday, September 25, 2014

Marketing 101

What Is Direct Response Marketing?

There are two major types of marketing strategies. The first is known as mass marketing or “branding”.
The goal of this type of advertising is to remind customers and prospects about your brand as well as the products and services you offer.
The idea is that the more times you run ads from your brand, the more likely people are to have this brand at the top of their consciousness when they go to make a purchasing decision.
If you’ve seen the ads from major brands such as Coca Cola, Nike and Apple you’ll have experienced “image” marketing.
The vast majority of advertising falls into this category.
There’s no doubt that this type of marketing is effective, however it is very expensive to successfully pull off and takes a lot of time.
It requires you to saturate various types of advertising media e.g. TV, print, radio, Internet etc. on a very regular basis and over an extended period of time.
The expense and time involved are not a problem for the major brands as they have massive advertising budgets and product lines are planned years in advance.
However, a problem arises when small businesses try to imitate the big brands at this type of marketing.
The few times they run their ads is like a drop in the ocean. It’s nowhere near enough to reach the consciousness of their target market who are bombarded with thousands of marketing messages each day.
So they get drowned out and see little or no return for their investment.
Another advertising victim bites the dust.
It’s not that the small businesses aren’t good at “branding” or mass-media ads. It’s that they simply don’t have the budget to run their ads in sufficient volume to make them effective.
Unless you have millions of dollars in your marketing budget, you have a very high probability of failure with this type of marketing.

Direct Response Marketing

The second type of marketing strategy is called “direct response”.
Direct response marketing is designed to evoke an immediate response and compel prospects to take some specific action, such as opting in to your email list, picking up the phone and calling for more information, placing an order, or visiting a web page.
So what makes a direct response ad? Here are some of the main characteristics:
It’s trackable. That is, when someone responds, you know which ad and which media was responsible for generating the response. This is in direct contrast to mass media or “brand” marketing – no one will ever know what ad compelled you to buy that can of Coke, heck you may not even know yourself.
It’s measurable. Since you know which ads are being responded to and how many sales you’ve received from each one, you can measure exactly how effective each ad is. You then drop or change ads that are not giving you a return on investment.
It uses compelling headlines and sales copy. Direct response marketing has a compelling message of strong interest to your chosen prospects. It uses attention grabbing headlines with strong sales copy that is “salesmanship in print”. Often the ad looks more like editorial than an ad (hence making it at least three times more likely to get read).
It targets a specific audience or niche. Prospects within specific verticals, geographic zones or niche markets are targeted. The ad aims to appeal to a narrow target market.
It makes a specific offer. Usually the ad makes a specific value-packed offer. Often the aim is not necessarily to sell anything from the ad but to simply get the prospect to take the next action, such as requesting a free report.
The offer focuses on the prospect rather than on the advertiser and talks about the prospect’s interests, desires, fears and frustrations.
By contrast mass media or “brand” marketing has a broad, one size fits all marketing message and is focused on the advertiser.
It demands a response. Direct response advertising, such as pay-per-click advertising, has a “call to action”, compelling the prospect to do something specific. It also includes a means of response and “capture” of these responses.
Interested, high probability prospects have easy ways to respond such as a regular phone number, a free recorded message line, a web site, a fax back form, a reply card or coupons.
When the prospect responds, as much of the person’s contact information as possible is captured so that they can be contacted beyond the initial response.
Multi-step, short term follow-up. In exchange for capturing the prospect’s details, valuable education and information on the prospect’s problem is offered. The information should carry with it a second “irresistible offer” – tied to whatever next step you want the prospect to take, such as calling to schedule an appointment or coming into the showroom or store. Then a series of follow-up “touches” via different media such as mail, e-mail, fax and phone are made. Often there is a time or quantity limit on the offer.
Maintenance follow-up of unconverted leads. People who do not respond within the short term follow-up period may have many reasons for not “maturing” into buyers immediately. There is value in this bank of slow-to-mature prospects. They should continue hearing from you once to several times a month.
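The “trackable” and “measurable” characteristics above come down to simple arithmetic per ad. Here’s a sketch (all ad names and numbers are invented) of keeping only the ads that return more than they cost:

```python
# Hypothetical per-ad results, with revenue attributed via response tracking
# (coupon codes, dedicated phone numbers, tracked URLs, etc.)
ads = {
    "postcard_A": {"spend": 500.0, "revenue": 2600.0},
    "radio_spot": {"spend": 1200.0, "revenue": 900.0},
    "ppc_offer":  {"spend": 300.0, "revenue": 1500.0},
}

def roi(spend, revenue):
    """Return on investment: profit expressed as a multiple of spend."""
    return (revenue - spend) / spend

# Keep only ads that pay for themselves; drop or rework the rest
keepers = {name: r for name, r in ads.items() if roi(r["spend"], r["revenue"]) > 0}
for name, r in sorted(keepers.items()):
    print(f"{name}: ROI = {roi(r['spend'], r['revenue']):.1f}x")
```

Because each response is tied to a specific ad, the losing ad is identifiable and can be cut, which is exactly what mass-media branding can’t do.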

Money At A Discount

Direct response marketing is a highly ethical way of selling. It’s focused on the specific problems of the prospect and aims to solve these problems with education and specific solutions.
It is also the only real way for a small business to affordably reach the consciousness of a prospect.
Your marketing system must deliver profitable results.
You have to know what a customer is worth to you, and then decide what you are reasonably willing to invest to acquire one, and then build systems that work within that limit.
Direct response is an accountable way to run marketing for a small business, as it is highly focused on return on investment.
If $10 bills were being sold for $2 each, how many would you buy?
The name of the game with direct response marketing is ‘money at a discount’ e.g. $2 into advertising to get $10 out in the way of profits from sales.
When you turn your ads into direct response ads, they become lead generating tools rather than just name recognition tools.

Connect with Allan Dib on Google+ 
- See more at: http://successwise.com/what-is-direct-response-marketing#sthash.vXnWVWji.dpuf

Tuesday, September 16, 2014

Pay To Play: The End of FREE Social Media Marketing?

Facebook is at the vanguard of squeezing increased value from paid social media marketing (SMM) – and other networks are following

Twitter has hinted it might create a newsfeed-style algorithm.

Marketers have been complaining for some time that organic reach no longer exists on social media and that brands have to pay for adverts if they want to engage with their target audience.
These rumblings have been growing louder over the past 12 months, particularly in regards to Facebook, which some marketers claim is now little more than a glorified advertising network. However, it could be that Twitter will also become a so-called ‘pay-to-play’ network for marketers, as it recently hinted at plans to implement a news feed-style algorithm.
This means users will no longer see tweets in one continuous real-time feed, but will be shown the content that Twitter deems to be the most important or relevant. Apparently the aim is to improve the user experience, but a cynic might suggest Twitter also wants to squeeze more ad revenue from brands by restricting their organic reach.
To give some context to this debate, it’s worth looking at the evidence that’s stacking up against Facebook.

Dwindling organic reach on Facebook

Much of the disquiet among marketers has been fueled by Forrester research published in October 2013, which began with the bold statement that “Facebook is failing marketers.” Based on a survey of 395 marketers, Forrester found that Facebook creates less business value than any other digital marketing opportunity.
Chart: Forrester Research highlights dissatisfaction with Facebook marketing. Photograph: Forrester Research
Forrester’s report analyst, Nate Elliot, claimed this dissatisfaction was due to poor ad targeting capabilities that failed to properly utilise social data, and a perceived failure to deliver organic reach.
“Everyone who clicks the like button on a brand’s Facebook page volunteers to receive that brand’s messages – but on average, you only show each brand’s posts to 16% of its fans,” he wrote.
Separate data published by Ogilvy in March this year showed that organic reach on brand pages had plummeted to just 6%, a sharp fall from 12% in October 2013. The situation is even bleaker for pages with fewer than 500 fans, as they saw organic reach fall from an already low 4% to just 2.1%. Ogilvy also quoted anonymous “Facebook sources” as saying that it wouldn’t be long before organic reach hit zero.
Ogilvy needn’t have relied on unnamed sources, as Facebook has been relatively open about its desire to restrict organic reach. In a sales deck distributed to ad partners at the end of 2013, Facebook stated: “We expect organic distribution of an individual page’s posts to gradually decline over time as we continually work to make sure people have a meaningful experience on the site.”
Chart: Facebook organic reach decreases. Photograph: Social@Ogilvy

Organic reach is being crowded out

In essence, brands are being crowded out of social media platforms as content from publishers and people’s friends is given priority in the news feed. The document went on to suggest that Facebook fans were no longer a way to gain free exposure, but were instead a way of making ads more powerful as they provide greater social context.
The Facebook report continued: “Your brand can fully benefit from having fans when most of your ads show social context, which increases advertising effectiveness and efficiency.”
Social context in Facebook ads. Photograph: Facebook

Is the future ‘pay-to-play’?

It does seem that it’s becoming increasingly difficult for brands to gain any organic exposure on Facebook. The evidence from third-party research and Facebook’s own declarations make the future seem bleak for those who hoped that the free ride of social advertising might continue.
News from other social networks also suggests a shift to the ‘pay-to-play’ model.
  • Twitter
On average, tweets only reach around 10% of followers as they are quickly drowned out by other posts, according to data pulled from Twitter’s new analytics platform. As mentioned, Twitter is chasing greater ad revenues and planning new controls that dictate what content users see in their feeds, so it could be that brands find their organic reach is further reduced.
  • Pinterest
Pinterest is also finally moving towards monetising its platform through the use of ‘Promoted Pins’, which the company announced in June. There is currently a waiting list for this function, which may be the start of restricting what branded content users are exposed to. The company’s last funding round valued the business at $3.8bn (£2.3bn).
We’ve already seen that Google is willing to remove privileges from its free services – such as hiding keyword data in Google Analytics tools – in order to boost its ad revenues.

Organic social reach isn’t dead yet

In April, new research showed that while on average many Facebook brand pages have seen a drop in organic reach, the top 1% of pages still reached 82% of their fans. It could be that it simply comes down to producing content that is both relevant to your audience and tied to long-term business goals, rather than chasing virality and looking for quick wins.
Ultimately, it’s a matter of wait and see for how the social media networks develop their revenue streams and what this means for brand content.

David Moth is deputy editor at Econsultancy where he covers ecommerce and digital marketing. Follow him on Twitter: @DavidMoth

Monday, September 1, 2014

Website Management - It's not do-it-yourself if you're not doing it!

Anyone who runs their own business knows how hard it is to keep up with the myriad of day-to-day tasks that go along with business ownership.  When you’re juggling the responsibilities of being CEO, CMO and HR, chances are your Internet presence is suffering.  Websites are no longer the simplistic online brochures they were 15 years ago; they have evolved into full-scale business solutions.  They are multi-functional tools that can be used to solve a variety of problems, including those related to sales, marketing, logistics, customer service and public relations.  In the last decade, a number of free content management systems (CMS) were developed to enable anyone to establish a marketing presence on the world wide web. The rising popularity of CMS such as WordPress®, Joomla® and Drupal® has led to a dramatic increase in the number of do-it-yourselfers -- especially small business owners who decide to take on the roles of web manager, search engine optimization expert and Internet marketer.  But, it’s not do-it-yourself if you’re not doing it.

Forward-thinking entrepreneurs KNOW that a good web manager is worth their weight in gold, working with you to grow your Internet business presence while allowing you to focus on the other essential aspects of owning and running a business. Website managers open the door to success by maintaining your company image, directing sales traffic, managing consumer relations and promoting your virtual real estate.  The main focus of a website manager is to keep your website up-to-date and operating smoothly.  A well-maintained and well-marketed website will provide a steady stream of business.
Now that you know how a website manager can help you grow your business, it’s important to know what types of skills to look for in a website manager. An effective web manager will, at a minimum, offer the following website management services:
  • Website maintenance & repair
  • Internet marketing
  • Search engine optimization
  • Content management
  • Email marketing campaign development & deployment
  • Social media marketing & management
  • Reputation Management
Not only should your website manager offer these services, but they must also know how to use them in the most effective ways to benefit your business RIGHT NOW.

Website Maintenance & Repair

One key aspect of locating the right website manager for your business is to ensure that they offer website maintenance services. This requires specialized skills, including in-depth knowledge of hypertext markup language (HTML), cascading style sheets (CSS) and JavaScript to properly manage your website's code and scripts.  All web browsers use HTML and CSS to render text and graphics, and nearly all websites use some form of JavaScript to program or control a website's functionality.  Today's computers, mobile devices and smartphones use a multitude of Internet browsers.  Poorly written code or invalid programming can cause your site to fail to function on one or more browsers. Just as importantly, these issues can delay or prevent search engines from indexing your website. While research shows 76% of consumers use search engines to locate local goods and services, error-riddled websites are rarely included in search results due to their poor quality.
Since website management is such a critical job, it’s important to be sure that your website manager has professional level skills.  There are two free tools that can help you evaluate the skills and expertise of a website manager or website management company.  You can use these tools to examine previous work samples and see firsthand how effectively they market their own website.
1.) Website Code Validation Tool – Click on the link and enter any URL address to check a web page for code errors. This tool will display the number of errors on the page. The goal of any website is to have 0 errors to ensure optimal functionality and search engine indexing.

2.) PageRank® checker – Enter a web address to see the ranking of any page. This rating is an indication of your online marketing reputation.  It's also an indication of the online reputation of a potential web manager. You will want your website management provider to have a rank of 3 or above.
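To give a feel for the kind of structural error the first tool flags, here is a deliberately simplified sketch in Python of one check a markup validator performs: matching opening and closing tags. This is a toy illustration using only the standard library, not how any real validator is implemented, and it catches far less than a full checker does.

```python
from html.parser import HTMLParser

# Void elements never take a closing tag, so they are excluded from matching.
VOID = {"area", "base", "br", "col", "embed", "hr", "img", "input",
        "link", "meta", "param", "source", "track", "wbr"}

class TagChecker(HTMLParser):
    """Tracks open tags on a stack and records mismatched closing tags."""
    def __init__(self):
        super().__init__()
        self.stack = []
        self.errors = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in VOID:
            return
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append(f"unexpected </{tag}>")

def check(html):
    """Return a list of tag-matching errors found in an HTML fragment."""
    checker = TagChecker()
    checker.feed(html)
    checker.close()
    return checker.errors + [f"unclosed <{t}>" for t in checker.stack]

print(check("<div><p>ok</p></div>"))  # [] – no errors
print(check("<div><p>broken</div>"))  # mismatch and unclosed tags reported
```

Even this crude check illustrates why the error count matters: a page with mismatched tags can render differently across browsers and trip up search engine crawlers, which is exactly the problem described above.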

Search Engine Optimization

Understanding the nuances of search engine algorithms and optimizing websites properly is a mandatory skill for any website manager.
9 out of 10 Internet users in the United States say that Google is their preferred search engine.  Imagine having your site banned by Google. This can be the result of working with an inexperienced or overzealous website manager.  And even if your site is fully developed and optimized when it launches, the work of a web manager is not finished. Web maintenance is an ongoing process. Your web manager must stay abreast of the many regular updates and changes required to keep the site properly optimized. This is crucial because if your customers and prospects cannot find you using popular search engines, your business will be lost on the over-populated Internet.
Effective website management will help your website content rank higher in search engine results. This is especially important since 75 percent of searchers don't scroll past the first page of search engine results.  Consumers pay the most attention to results displayed at the top of the search engine results page.  If your business doesn't rank well in organic (unpaid) search results, or if a search yields negative or unflattering reviews, for example, you're losing a lot of potential customers looking for trustworthy, quality businesses like yours. A good web manager can help by creating informative, useful, and interesting content that contains relevant keywords, ensuring your business is positioned well in search engine results.

Social Media Marketing & Management

To maximize the results of your website marketing strategy, focus your resources on the avenues with the highest return on investment for your business including a social media campaign. Are you a beauty supply company wasting valuable hours a week focusing on Twitter posts?  Perhaps you heard the latest buzz about Pinterest or Instagram so you devote a few hours a week to posting information about your building supply company.  Without social media management (SMM), however, you may actually be wasting your valuable time trying to attract the wrong audience.  A social media manager can help you to focus efforts on the best media platforms to bring the fastest return on investment for your business.
Reputation management is another key component of your social media strategy. It is something to think about before your business is negatively impacted, when orders dry up or your phones stop ringing.  You need to know what people are saying about your company in real time.  Monitoring social media can alert you when someone mentions you or your employees on popular sites like Yelp, Facebook or Twitter.  Staying abreast of what others are saying about your business puts you in the best position to respond appropriately and maintain your professional reputation. You may not always be able to recover the customer, but you can absolutely prevent a recurrence.
Businesses have a lot of options when it comes to website management.  Let IMC Website Management take care of maintenance, repairs and promotion. We'll manage every aspect of your website, including complicated upgrades, repairs and security. We don't stop there; we take care of getting the word out about your website by effectively marketing your site to search engines, directories and social media.

© 2014 Internet Market Consulting. All Rights Reserved. Publication rights granted so long as article and byline are reprinted intact, with all links made live.

Tuesday, August 26, 2014

Search Engines, Directories and More - Oh My!

Search engines – webmasters and search engine optimization (SEO) professionals follow their guidelines for the highest possible rankings on them; paid search marketers pay to be featured on them; and users turn to them when they're searching for answers, information, or entertainment.

Search Engine Watch has been covering search engines since June 1997 and has watched the industry evolve to its current state. Over time, many search engines have come and gone, as users have spoken with their keyboards (and literally with their voices – thanks to voice search technology).

In recent years, search market share has remained mostly unchanged – for much of the world, it's Google followed by every other search engine (in the U.S. the "Big 5" search engines consist of Google, Bing, Yahoo, Ask.com and AOL, which combine for hundreds of billions of searches every month). Meanwhile, many of the players have consolidated or have become footnotes in history.

What follows is an overview of today's major global search engines, with some history and explanation of why each one is important to webmasters, marketers, and users. We'll then review some of the top directories.

Major Search Engines

Google


Started in 1998 as a university project by Stanford University students Sergey Brin and Larry Page, Google is now the dominant search engine by no small margin, and its rise was anything but slow.

In fact, in June of 1999 Netscape Search was updated and AOL/Netscape search began to be powered by Google, bringing their search volume to approximately 3 million searches per day; huge for the time.

On June 26, 2000, Yahoo selected Google to provide its organic search results (replacing Inktomi) with its impressive index of more than 25 million pages; again, huge for the time.

Google has since become synonymous with the word "search" and as most of us know, is often used in place of the word. Don't know the answer? Google it!

The continued strength of Google as a search provider is based on a large number of factors and won't be debated here, save to say they have successfully provided the results people are looking for, in a manner searchers either enjoy or are comfortable enough with not to switch to a different provider.

Google continues to tweak their search algorithm multiple times per month and adjust the layout of their results to test for improved visitor experience and advertising revenue.

The majority of Google's revenue is derived from their AdWords and AdSense programs. In fact, advertising accounts for more than 95 percent of Google's earnings. If there is a weakness in the Google model this is it; they need to tweak their layout and results to promote the paid avenues of their offerings. This gives advantages to other engines who may have revenue generation strategies outside of search.

Bing

Bing was launched in May 2009 as a fundamental upgrade to Microsoft's previous search effort, MSN Search.

Since the launch of Bing, Microsoft's share of the search marketplace has more than doubled. Add to that the deal between Microsoft and Yahoo for Bing to power Yahoo's organic results and Bing powers over 25 percent of search.

The Microsoft/Yahoo alliance also meant that a single paid search platform would power both Yahoo and Bing's paid results. While this may not seem like a big deal on the surface, it is actually huge.

Where once business owners and marketers had to decide whether it was worth the hassle of managing a Bing paid campaign (given the significantly lower traffic it yields compared to Google) and then make the same call about a Yahoo paid campaign, the two are now manageable in one convenient location, significantly reducing the time it takes to set up and manage them.

This of course makes these campaigns less expensive to run, and when you combine that with Bing's increased market share, they're in a position to take some of the ad dollars from Google (or at least gain some for themselves).

Yahoo

Yahoo is an interesting search engine and one which, until recently, I had a very hard time taking seriously.

Once upon a time Yahoo was a major leader in the search field, but it has been in decline, making bad decision after bad decision, announcing layoff round after layoff round, and making what can only be described as one of the worst business decisions in history when it turned down a takeover offer from Microsoft valued at $33 per share. Yahoo shares dove after that and have never been anywhere close since.

From that point until 2012, it seemed that every piece of news from Yahoo was bad news, until July 16, when the announcement came that they had snagged Marissa Mayer from Google to become CEO. This was the first promising move they'd made in a long time, and it had people wondering if this might just be the breath of fresh air and change of direction that the company needed.

From reviewing all hires, to selling key holdings such as its stake in Alibaba, to putting its own search technology back in the forefront, Yahoo has maintained its position as one of the top three search engines, despite not producing its own organic results.

Other Major Search Engines Around the Globe

Google dominates the U.S. and most of the world – but not everywhere. Yahoo and Bing have had about the same luck (zero) making a dent in Google's search market share on other continents, but a couple of search engines in other countries have managed to stay ahead of the Mountain View, California-based search engine. If you're from these countries or interested in marketing to them – pay attention.

Baidu

In China, Baidu is the major player, with more than three of every four searches conducted on its engine.

To say Baidu blends organic with paid search is misleading; rather, they use a hybrid approach wherein they offer pay-for-performance (P4P) results (advertisers bid to have their websites placed at the top of what would appear to be the organic results).

In addition, Baidu offers PPC ads which, similar to AdWords, are displayed at the top or right of the standard results. One could argue that the existence of the PPC-like results further confuses users clicking on the standard results area into believing they are organically generated.

While some investors consider Baidu to be overvalued as a stock, their earnings are consistently high. For companies looking to market into China, understanding Baidu is crucial.

Yandex

Yandex is the most popular Russian-language search engine, with significant market dominance in Russia.

On October 1, 2012, Yandex launched their own browser and mobile app to keep their position secure against Google, their only real competitor in the space.

Yandex's advantage in Russia seems to be based on an algorithm that performs much better at understanding the language's unique syntax and integrating that into its consideration of what type of results the user is likely looking for (for example, whether the search string is a question or simply a keyword entry).


Directories

Directories are an interesting topic. Do they carry weight? Can they hurt your rankings? Should you even bother? The answer here is yes, yes, and yes.

This section will mainly focus on general directories, but the end of this section does include a few tips on how to find niche directories (or even other general directories) and how to determine if they are worth getting a listing on.

The Yahoo Directory was started in 1994 under the name "Jerry and David's Guide to the World Wide Web" but in 1996 became Yahoo. At the time, Yahoo was primarily a directory with search functionality, and (interestingly) neither SEO nor Internet marketing was even a category.

Through the late 1990s Yahoo pushed to become a web portal, and in 2000 even signed a deal with Google that would see Google power Yahoo's search functionality. Their focus at the time was to acquire users through acquisitions such as GeoCities (RIP), bringing more people into their portal and keeping them there. Unfortunately Yahoo! didn't have the same user loyalty that Apple does, and the walled-garden approach failed as users Googled their way out of the Yahoo network of sites (ironically, right on Yahoo's own properties).

All this said however, they still provide a solid directory (back to their roots). The cost is a non-refundable $299 review fee.

Best Of The Web may be my favorite of the general directories, due in no small part to the fact that they allow for a permanent listing. The directory was founded in 1994 as a listing of the best of the web (1994 seems to be the year of directories) and actually gave out a series of awards (take a peek; it's interesting to see what types of things won back then). That lasted until 1998, after which the site lay dormant until it was purchased in 2002 and became a general web directory.

BOTW is a human edited directory. They will decline your listing if they don't like the site. A submission is $150 annually or $300 for a permanent listing.

No list of directories would be complete without DMOZ. DMOZ was founded in June 1998 as Gnuhoo. It was purchased by Netscape in October of the same year, at which time it became The Open Directory Project. By April 2000 it had surpassed the Yahoo Directory in the number of URLs in its index, and it currently sits at about 5.2 million.

For those in the industry long enough to remember, DMOZ suffered a catastrophic failure in October of 2006, at which time they had to display a backup version of their directory. This wasn't remedied until December, and new sites couldn't be suggested until January. This is when it seemingly became increasingly difficult to get a listing in DMOZ, as many editors seemed to have found new things to do with their time.

It is still possible to get a listing in DMOZ. For the 10 minutes it takes, it's well worth the time and it's free to submit. (Tip: try to submit to a category that has an editor.)

Business.com was started in 1999 as a search engine for businesses and corporations. They came close to bankruptcy when the dot-com bubble burst, but after major layoffs and restructuring they became profitable once more in 2003.

Business.com is focused on business-to-business resources (so take that into consideration when thinking about submitting). The cost is $299 per year and all submissions are reviewed manually.

As with Yahoo and BOTW, the fee is non-refundable if your site isn't accepted. You're paying for the review, not the link.

Honorable Mentions
Moving past the major players, there are also a number of other good general directories. These directories have all survived many updates including the Penguin and Panda rounds.

Remember, though, link building is about balance. You don't want to submit to a bunch of directories and consider your job done. A better strategy would be to bookmark this page, submit to a few and as you're building more links using different strategies, add a directory or two mixed in with the rest.

  • Jayde – Submission is free.
  • Ezilon – $69 annual fee or $199 permanent.
  • Alive – $75 annual fee or $225 permanent.
  • 01 Web Directory – Free submission option or $49 one-time for a guaranteed 3-day response time.
  • Aviva – $50 annually or $150 permanent.
  • SunStream – $29 annually or $49 permanent.
  • GoGuides – $69 one time.
Again, this list only contains consistently solid general directories.

Directory Guidelines

You'll want to also look at niche directories (which may well hold more weight than any of the general directories above), but you need to be careful. There are many horrible directories out there.

Here are a few directory guidelines to follow that universally apply:

  • Is the submission a guaranteed placement? If a directory will list you automatically (with or without a fee) then it's not an editorial link and either doesn't carry weight or likely won't in the near future. It should be avoided.
  • Do they require a link back? If they do (even for their free listings when a paid is available), it probably should be avoided.
  • Is their PageRank 3 or below? Yes, it's an old metric, but is still helpful to gauge general site health. A directory with a PageRank of 3 or less will, at best, pass virtually no weight; at worst, it'll cause you problems. Generally, you should only look at PageRank 3 directories in the case of niche directories; with general directories, don't even consider anything less than a 4.
  • Common sense. Ah, the toughest one because our brains can trick us into seeing what we want to see. When you look at a directory (or any other link source for that matter) you have to ask yourself, "does it make sense that this link should pass weight to my site?" If you can honestly say "yes" to this then it's likely a good link.
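The PageRank guideline above boils down to a simple threshold rule. As a sketch only, here is one possible way to encode it in Python; the function name and boolean flag are illustrative choices, not anything from an actual tool:

```python
def directory_worth_submitting(pagerank: int, is_niche: bool) -> bool:
    """Encodes the rule of thumb above: niche directories are worth a look
    at PageRank 3 or higher; general directories only at 4 or higher."""
    minimum = 3 if is_niche else 4
    return pagerank >= minimum

# A PR-3 niche directory passes the bar; a PR-3 general directory does not.
print(directory_worth_submitting(3, is_niche=True))   # True
print(directory_worth_submitting(3, is_niche=False))  # False
```

Of course, a threshold is only a first filter; the "common sense" test above still applies to anything that passes it.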

A Final Warning

The saying "don't put all your eggs in one basket" comes into play here. Once again, directories can provide good and relevant links to your site (and hey, even some traffic) but a solid link profile contains variety.

Never put all your energies into one single link source. If you find a lot of great niche directories, put them all on a list and add a couple each month while you're engaged in other strategies to help remind Google that you're not a one-trick pony. You have good content liked by directory editors, bloggers, social media netizens, and others.

by Dave Davies