- Online Marketing
Not long ago, online marketing was a limited affair, explored mainly by already successful or adventurous businesses. Likewise, search engine marketing (SEM) was restricted to an elite few. Things have since changed considerably, and these two marketing mediums are no longer restricted to a select few.
Search engine marketing, in particular, has seen triple-digit growth as it has become accessible to marketers and opened up rewarding traffic streams from billions of users. It is an ongoing and somewhat complicated process, but those complications can be resolved by seeking the expert services of an internet marketing agency. Search engine marketing is an efficient way to bring in sales leads: almost a collective form of newspaper ads, yellow pages, direct mail, and classifieds rolled into one.
Search engine marketing is concerned with finding and submitting the most sought-after keywords to search engines such as Google and Yahoo! This is because by using these keywords, a business website can prominently figure in the search results, leading to credibility, market exposure, more customers, quality leads, customer satisfaction and better sales.
SEM can help a business enhance its brand profile, acquire new customers, retain existing customers, gain industry exposure, increase web traffic, generate quality leads, and increase online sales.
However, a business must ensure that slicing and dicing keywords does not undermine its objectives. It is also important for a business to realize that managing rankings is a full-time job, and that using a combination of search engines to reach customers is the best way to gain market exposure in the long run.
Once a business realizes these facts, there is nothing that can come between it and success in the online world.
Great article from Searchengineland.com about making sure you have complete coverage on any given SERP.
Ever since marketers scrutinized Google’s research on how organic and paid search results work together — the search giant concluded that nixing the paid ads would result in an 89% drop in clicks — it’s been clear there’s more to the story. What happens if your brand is the top organic result for the keyword? Surely the results would be different than if your organic result was on the second page?
“When we released the first paper, we had a lot of questions coming back, asking for more details around incrementality and under what situations you can expect different numbers,” said David Chan, Google’s lead researcher for this study.
So, Chan set out to research more subtleties in the interaction between organic results and paid search ads, and today released new results.
The 89% number makes more sense now that the new results show that paid search ads appear without an accompanying organic search result on the page 81% of the time, on average. Only 9% of the time does a search ad show with an organic result in the top rank. An organic result appears in ranks 2 through 4 about 5% of the time, and in lower ranks (below 5) about 4% of the time.
Though the researchers didn’t specifically look at branded versus generic terms, Chan said, the ranking is a good proxy, in certain cases, for branded versus generic terms. In other words, the brand’s organic result is likely to appear higher if it’s a branded term.
Surprisingly, even when brands’ organic results are ranked number one, they get 50% more clicks, on average, when there’s an accompanying paid search ad.
“It is a very surprising result, and, I think in some ways, it runs counter to what people would think but the data speaks for itself,” said Chan.
The study found that 82% of ad clicks are incremental when the associated organic result is ranked between 2 and 4, and 96% of clicks are incremental when the brand’s organic result was 5 or below.
This past weekend, I had a friend call me to talk about his latest SEO scheme. It was this topic specifically: he had paid five dollars for a backlink program from .edu domains. I advised him that paying for backlinks from a website that had nothing to do with his business wouldn’t benefit him very much after Google’s 30-pack update on January 5th. Google has become very wise to backlinks and the spamming associated with them.
On the heels of that conversation, Search Engine Watch posted the following article.
The process has already started, and as a publisher you need to make sure you are adapting your marketing strategy to line up, or get left behind.
Google popularized the link-based ranking algorithm in the late 1990s and early 2000s. It was a revolution in its time because it provided search engines with a method for identifying the most important web pages for a given topic. However, as has been well documented, spammers have assaulted the algorithm with a wide variety of methods for buying links or creating them in other ways that don’t represent genuine endorsements.
Even if you generate all your links in a pure white hat way, through reaching out to site owners and requesting them without compensation, or are doing high quality guest posts, you aren’t necessarily generating the best possible signal for search engines. Certainly this type of link building done properly would not be a violation of the Webmaster Guidelines, but from the perspective of the search engines it also doesn’t represent a groundswell of opinion raving about your product. It still means something, but it is brute force driven through your efforts, rather than resulting from the enthusiasm of your audience.
I don’t believe that search engines will penalize people who link build this way, but I think they will value the link profile that is manually built less than one that obtains unsolicited endorsements from the web.
Prior to the emergence of Google, links weren’t a ranking factor in any significant search engine. At that time, any unpaid links were implemented solely on merit, because the publisher had no other reason to link to someone else’s page. Even paid ads were based on the advertiser valuing the traffic from the target site enough to be willing to pay for it, since there was no other benefit, so these too went to highly relevant pages as a rule.
Short and simple: links were a better quality signal when the world didn’t know that they were a signal. But, those days are gone.
The search engines are constantly in search of additional signals to help provide better data on the best results to return for a given query, and to make it harder for spammers to succeed in ranking lower quality sites (lower quality than others that are available on the web). The increase in the use of social signals by the engines has been a part of that effort.
However, social signals are relatively noisy. As I documented in “Social Signals and SEO: Focus on Authority,” the number of people on the major social sites that are actively recommending sites/content is still a relatively small percentage of the population.
That same article also documented how using social media’s “wisdom of the crowd” (showing the most liked articles) was something that Bing tried, but then later removed. I believe that this happened because using social media mentions as votes in the same way that links were used did not really work, even in the limited fashion that Bing tried it.
I expect that for many categories of searches, search engines will weight sites that show multiple types of signals more than those that show only one. Back in July I wrote about “The Dangers of a One Dimensional Link Building Plan.” However, in addition to avoiding a one-dimensional link building plan, you should also be careful not to use old-fashioned link building as your only method for promoting the site. Find a way to get the web to generate other signals about what you are doing!
The first key is to focus on where your audience is (what sites they visit, what videos they watch, whose columns they read, …). Think like a pre-Internet marketer would when trying to decide how to spend their ad dollars. Ranking signals can be generated by both your potential customers and the publishers of the content on the web that they visit.
Potential customers can create signals by:
There are a lot more methods than these few!
Publishers of the content that your audience consumes can generate signals as well, in the form of good old-fashioned links. So what are the ways to encourage the generation of these types of signals?
As per my recent columns, you should certainly focus on authority, and seek to become an authority. Even if you aren’t yet an authority yourself, you can do things to get your name out there to start getting exposure to authorities and to build visibility with others. Here are a few specific ideas on how you can do that:
Regardless of where you are in the process of building your own authority, do some things to attract positive attention to your website. Participating in discussions online is a great place to start. Participating in offline discussions that you can use to help drive online interactions is also a great thing to do.
The key is to create great signals in addition to the links that your site attracts.
Wikipedia’s definition of algorithm: a procedure for solving a mathematical problem (as of finding the greatest common divisor) in a finite number of steps that frequently involves repetition of an operation.
This has been a year where search marketing truly matured. Mobile became one of the largest growth components, with its share of the overall search market now reaching 20 percent.
Looking ahead, expect 2012 to be the year when we push further into the Mobile Semantic Web 3.0, which presents opportunities that will drastically change SEO as we know it.
There will be “Four Horsemen” of an SEO apocalypse in 2012 and they will all be carrying a mobile device.
Quick Response or QR codes in 2012 will be near ubiquitous both online and offline in the U.S. to easily send people with mobile devices to online destinations.
QR codes are increasingly being used on billboards and various signage, as nearly 50 percent of U.S. mobile users have a smartphone. Mobile devices are nearly always with their users and are often used in tandem with other media, such as magazines and television.
I envision these pixelized matrix barcodes appearing in the corner of all televised programs, especially news. Viewers will then be able to receive and share the latest updates on that televised news or programming with their mobile devices online. Television will also be the medium that explains how to use QR codes as they are being used, so the practice takes hold in mainstream understanding.
Ultimately, QR codes will then replace, or appear alongside, URLs in all offline advertising and even online. This will cause search engines to rely less on text links and factor in interest from QR code use. Google and Bing can acquire this information directly from mobile users on their operating systems, Android and Windows Phone 7, respectively.
There will be reduced dependency on text for search and logins in 2012, especially on mobile devices.
Voice actions on Android and recently Siri on the iPhone have provided users a more direct way of getting answers and producing actions on their mobile devices. Just as Google used its free 411 service to build the database for voice actions, I feel it will, in turn, use the database from voice actions as a free/paid service to business users for logins and purchasing online by just using your voice.
Another means of logging in for mobile users will be touch-friendly image combinations that can also be used to refine certain search results. Voice capabilities at times avoid the need for search engines and/or present the need to change search tactics.
Search will be further fragmented in 2012 as mobile search is itself already fragmented by device-type to encompass feature phones, smartphones and tablets – with each offering different results not only between one another, but also with desktops.
The current landscape of mobile search versus desktop is outlined, with data and action items, in a recently published Covario mobile SEO white paper. I envision that the search results of the three mobile device types, as well as desktop, will differ even more in 2012, as consumers’ ultimate search intentions and actions differ based on the device they are using.
A segmented search approach is needed, especially in mobile, as sites will need to be properly designed for each device type and optimized for the device user’s intent.
According to topseos’ recent ranking of SEO firms in North America, ReachLocal is ranked in the top 10. If you have been reading my blog or site for any length of time, it should be clear to you that ReachLocal is not in the SEO business. That said, looking at the ReachCast product, and based strictly on the design and implementation of the Cast Platform, SEO is a strong benefit of the product.
ReachCast was designed to be a social media product; however, customers who have used the Cast product have seen the SEO benefits take on a value proposition of their own, so much so that ReachLocal is being recognized as the 8th best company for SEO. Please see the image below.
Since its introduction in 2002, topseos has been identified as an independent authority on vendors who supply internet marketing products and services. Our mission is to offer comprehensive and independent advice to assist buyers in making purchasing decisions from internet marketing vendors.
We pride ourselves on a disciplined research process that keeps us regularly engaged with the companies we evaluate. Our proprietary analysis tools and methodology, developed over multiple years, include an extensive, rigorous evaluation rating system that is applied to each company that is identified and researched. We gather information about products and services and about consumer demand in the marketplace, monitor industry-wide trends, exhibit at industry tradeshows where we meet countless firms, and oftentimes even visit the firms we evaluate.
The folks over at Search Engine Land posted a great article about SEO and the changes that are going to be needed, in one man’s opinion. I thought it was a good read, so I added it here.
Search Engine Optimization is growing up. I am not ready to say the Wild West SEO days are completely eradicated, but in 2011 good search engine optimization is less about trickery and more about engaging content and audience development than ever before.
Over the years, quality optimizers have become more prone to avoid technical tricks like using CSS image replacement to inject keyword text or controlling the flow of PageRank by hiding links from search engines.
Search engines keep getting better at crawling and indexing. If you are unwilling to burn your website or risk your career, you follow the search engines’ terms of service.
During 2011, the conservative attitude toward code crossed the chasm to apply to content. For years, websites churned out poorly written, generic articles in the name of long-tail keyword optimization. It worked so well that some people turned crappy content into startups.
Now, thanks to Panda, Google’s site-wide penalty for having too much low quality content, people are asking why anyone would put pages on a website that no one wants to read, share or link to? Without taking potshots at the past, most of those articles look juvenile and antiquated.
Made in Japan went from signifying cheap to signifying marvelous. Made for the Web is growing up too. It is this evolution which guides my SEO highlights for 2012. I separate things to keep in mind by code, design and content.
While Google likes to tell us they are very good at crawling and understanding imperfect code, I prefer to assume search engines are dumb and help them every way I can. Simple code is honest code. It’s also easy to parse and analyze. Just because you can AJAX-up a page with accordions and fly-outs does not mean you should. The more code on a page, the more things that can go wrong from spider access to browser compatibility.
Follow standards and get as close to validated markup as reasonably possible. Make it easy for search engines to spider your site. Validating HTML and CSS does not automagically raise your rankings, but it will prevent crawl errors.
Make your CSS class and ID names obvious, especially for section div tags. Again, Google tells us they have gotten good at identifying headers, sidebars and footers. Part of that is almost assuredly knowing the most common div names.
Why would you name a CSS Class xbr_001 when you can name it navigation? At the very least, it will make life a lot easier on your SEO team. They have enough work without the need to translate ambiguous naming structures.
Reserve h# tags for outlining principal content. I am amazed at the number of big brand websites that still use h# tags for font design. Tell your designers that h1, h2, h3, h4, h5 and h6 are off-limits and reserved for content writers and editors.
The only exception to this should be if your content management system uses h1 tags to create a proper headline. Embargo h# tags out of your headers, navigation, sidebars and footers too. They don’t belong there.
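To make the markup guidance above concrete, here is a minimal sketch of a page skeleton that follows these conventions: descriptive div names that search engines can recognize, and h# tags reserved for the principal content. The specific class and ID names are illustrative, not prescriptive:

```html
<body>
  <div id="header">
    <!-- Site logo and branding; no h1 here unless the CMS uses it for the page headline -->
  </div>
  <div id="navigation">
    <!-- Menu links styled with CSS classes, not heading tags -->
  </div>
  <div id="content">
    <h1>Page Headline Written by the Editor</h1>
    <h2>A Subsection of the Principal Content</h2>
    <p>Body copy lives here.</p>
  </div>
  <div id="sidebar">
    <!-- Style sidebar titles with a class, e.g. class="sidebar-title", not an h3 -->
  </div>
  <div id="footer">
    <!-- Footer links and legal text; again, no h# tags -->
  </div>
</body>
```

Names like `navigation`, `sidebar`, and `footer` cost nothing and tell both search engines and your SEO team exactly what each section is, which is the whole point of avoiding opaque names like `xbr_001`.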
Look at the Zen like efficiency of any Apple product. Steve Jobs was ruthless about eliminating the unnecessary and achieving clean Bauhaus efficiency.
By contrast, too many websites, especially enterprise sites, try to be all things to all people. Their administrators or managers fear they might miss out on a conversion for lack of a link.
Websites should have clean vertical internal linking. Every page should not link to every page. You do not need a site-wide menu three levels deep. As long as people feel that they are progressing toward their goal or the useful information they seek, they will click on two, three or four links to get there.
Look at your website analytics. Which pages receive the fewest visits? Are any in your navigation? If no one uses a link, why does it need to be there?
A website’s most widely visited pages tend to be close to the homepage. Review your categories and sub-categories. Can you eliminate whole categories by merging or reassigning content? For example, does the management team need its own category or can you move it into the About section?
This is not just about eliminating distraction. It is a way to increase the internal flow of authority (PageRank, link juice, etc.) to SEO hub pages.
Emphasize Community and Conversation. If your business depends on the Internet and you have the budget to hire one more person, consider employing a community evangelist. High rankings require authority. Authority comes from off-site links and, to an extent, brand mentions.
Search Engine Land has published a great article about something I have been preaching for years. Oftentimes it’s not the SEO or SEM that isn’t working; it’s your landing page.
Landing pages are frequently pushed to the back burner when creating digital marketing campaigns even though they can often have the single greatest impact on the campaign’s success. This is particularly surprising considering landing pages have been a frequent topic that’s been written about and discussed for years.
Whether due to a lack of understanding about how to measure landing page performance, a lack of knowledge about how to create and test landing pages, or a general lack of awareness of the benefits of good landing pages, the fact remains that many marketers are running campaigns that are performing significantly below their potential.
Even when marketers are fully aware of the importance of landing page testing, the testing plans that get implemented are often lacking. A main reason for this is the narrow scope of testing that occurs.
When a digital marketing campaign launches, it often includes search advertising, display advertising, and social advertising – yet each medium tends to use the same landing page. Unfortunately, this leads the landing page testing down a path that will optimize for the largest segment instead of optimizing each segment individually.
The result is a campaign that doesn’t maximize the return on investment.
In order to have a robust landing page testing strategy, each segment that is targeted in the campaign needs to be accounted for and a testing plan needs to be implemented for each. The reason is that the behavior and needs of each segment are not the same, so focusing on one segment essentially ignores the needs of the others.
In the campaign example given, the three mediums used as part of the campaign are search, display, and social. Given the different types of activities being performed by users when they are exposed to each type of ad, you’d expect the behavior and response rates to differ.
For example, when a user conducts a query on a search engine, they are actively looking for information that your ad and subsequent landing page should be able to provide. With display advertising, you only know that the users may be interested in your offering based on how you’ve targeted the ads. They are not likely, however, to be actively looking to make a purchase at the time of their exposure to the ad. The same is true for social ads that drive users to your site with the hope that they will convert. They tend to be engaged in other activities and so a purchase decision may not be an immediate step that these types of users would take.
The calls-to-action tested for each segment should vary.
Even within a particular target segment you will likely see significantly different user behaviors. Search, for example, will likely have early shopper keyword groups as well as keyword groups that tend to close the purchase. Having a single landing page for both groups will not allow you to realize the maximum return on investment.
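As a minimal sketch of what segment-level testing looks like in practice (the data, segment names, and functions here are hypothetical, not from the article), you might tally conversion rates separately per traffic segment and landing page variant, then pick each segment's winner independently rather than letting the largest segment dominate the decision:

```python
from collections import defaultdict

# Hypothetical visit log: (traffic_segment, landing_page_variant, converted)
visits = [
    ("search",  "A", True),  ("search",  "A", False), ("search",  "B", False),
    ("display", "A", False), ("display", "B", True),  ("display", "B", True),
    ("social",  "A", False), ("social",  "B", False), ("social",  "B", True),
]

def conversion_rates(log):
    """Return the conversion rate for each (segment, variant) pair."""
    totals = defaultdict(int)
    wins = defaultdict(int)
    for segment, variant, converted in log:
        totals[(segment, variant)] += 1
        if converted:
            wins[(segment, variant)] += 1
    return {key: wins[key] / totals[key] for key in totals}

def best_variant_per_segment(rates):
    """Pick the highest-converting landing page variant per segment."""
    best = {}
    for (segment, variant), rate in rates.items():
        if segment not in best or rate > best[segment][1]:
            best[segment] = (variant, rate)
    return best

rates = conversion_rates(visits)
print(best_variant_per_segment(rates))
```

In this toy data, variant A wins for search traffic while variant B wins for display and social, which is exactly the outcome a single pooled test would hide.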
Website publishers have had about three months to react to Google’s “Panda” algorithm update, which primarily targeted “content farms,” sites that the Web search giant has long aimed to squash from its results because it views their information as basically identical to that found on other sites, or just too sparse to be useful.
Google claimed that Panda, at least in its initial late February release, affected only about 2 percent of U.S. search queries. However, the dramatic hits taken by some well-established sites dominated headlines in the Search Engine sector for months after the rollout.
“When a client’s site is pretty clean, it’s been a pretty easy cleanup,” said Dave Davies, the CEO of Beanstalk, an SEO consultancy based in Victoria, British Columbia, Canada. Davies did note that Panda has been “huge” in terms of impact, but sites are making a comeback. “Google is not really interested in punishing sites long term; it is just trying to protect its results,” he said. “So, if you correct your issues, you can re-build your reputation.”
Davies and other SEO and Search Engine Marketing (SEM) professionals who spoke to IT Business Edge suggested that if an otherwise reputable site took a hit after the Panda rollout, it probably was because it:
“Google doesn’t think like your business,” said Matt Law, the founder of Law Marketing Systems, an Internet marketing and SEO consultancy based out of Orlando, Fla. “They think like a bunch of California yuppies who run the Internet. And they do.”
In this first part of our two-part series, we’ll look at the issues surrounding the “Panda penalty” for duplicate content and the best tactics for ensuring Google and other search engines identify your site pages as unique. In part two, we will look at the general guidance Google and SEO experts offer for making sure your content is valuable enough to rank highly in search results. We’ll also take a look at how Google may be using user behavior and social interactivity to gauge that value.
So, what is the big deal about duplicate content, anyway?
In my job I talk with many companies that “have an SEO guy.” I always talk with them about the great sides of SEO, and on occasion we talk about the downsides. Anyone who knows me knows that I am a big advocate of SEO and SEM together; they provide a great mix. In fact, I talk to a lot of my customers about using Denver Media for some SEO help on the campaigns that I run. Sometimes I run across a customer who has found a black-hat SEO guy who is performing miracles; generally it doesn’t take too long before the site traffic dissipates and the site can’t be found. Please read this article if you are considering SEO, and make good choices; after all, your business may depend on it.
Computerworld – Being at the top of a search engine results page can mean the difference between business success and failure. So, what would you do to ensure a listing there?
If so, you could be walking into a minefield.
Search engine optimization (known as SEO) involves actions intended to get your page listed higher on a search engine results page. In the past 15 years, SEO has evolved into a complex art, one that is now the foundation of many businesses.
The problem is that there are ways of trying to improve your standing that are considered legitimate by the search engine companies like Google, but there are also methods that can get you into trouble. Google (which receives 90% of the world’s search engine traffic, according to StatCounter, and 65.4% of the U.S. market, according to comScore) does not appreciate being gamed — and will retaliate.
Just ask $17 billion retailer J.C. Penney, which got caught using black-hat (i.e., illegitimate) methods to boost its search results during the 2010 holiday shopping season. Penney was accused of taking part in a so-called link scheme, probably the most complicated black-hat SEO technique.
“Our high [search engine result] rankings were pushed down,” Darcie Brossart, Penney’s vice president of communications in Plano, Texas, confirmed concerning the sanctions Google imposed. “We have terminated our relationship with our former natural SEO firm. We don’t know how it happened. We did not authorize it, and we were not involved.”
It’s important to recognize if your SEO firm (or your in-house Web expert) is venturing too close to the edge of the black-hat cliff — because if Google or other search engines find there is some hanky-panky happening, it’s your site that will suffer. “I’m not saying everyone is doing it, but it’s not unusual,” says Vanessa Fox, former Google Search employee and author of Marketing in the Age of Google. “A company might hire an SEO firm without knowing a lot about SEO, or they might think it’s not risky,” she adds. (Google publishes advice for those considering hiring SEO firms.)
I enjoy articles like this because so many customers fall for this pitch, and they are normally some of the hardest clients to get on board. Overcoming the “victim of a scorched earth” syndrome is very difficult. I think that educating yourself before making a purchase like this gives you the power not to get burnt, and to avoid the syndrome completely.
I received an email this week from a customer that was considering using an online SEO (search engine optimization) agency that guaranteed first page Google results.
For those of you who aren’t web savvy, search engine optimization is the work you do as a company to be found on the left-hand side of Google search results (also called organic or natural results). These are the results that Google feels are most relevant to the person’s search. Any of the shaded results you see at the very top or to the right are paid for by vendors (like advertising).
Before I go into the idea of guaranteeing online results, let me take you back about 20 years. In high school, I used to sell glasses at an eyeglass replacement store in our local mall. After selling a customer a pair of glasses, I would offer them our special “scratch proof resistant” package for $29.97. This special package enabled us to put a coating on each lens that would protect the glasses from scratching.
Little did the shoppers know that the coating was already on every pair of lenses we sold. It came standard. The whole “package” was a sham.
Guaranteeing Google Rankings Is a Sham
Any business owner wants the easiest and most effective route to online marketing success. The only problem is this: Success with search engine rankings and social media is hard, and it takes dedication and time to see results. There are no easy fixes.