SEO Client Conversations: The CC (Client Conversation) Series

I have been working on SEO projects since 2000 and have had many interesting, funny, enriching and mind-boggling conversations about SEO with various clients in that time. I must say that as SEO and the web have evolved, the clients' approach to SEO has evolved alongside, and the kind of questions clients ask and seek answers to has also progressed since 2000.

Client Conversation Series - WebPro Technologies SEO Blog

We keep reading a lot about the latest algorithm updates, SEO news and almost everything related to SEO, but the client-side thought process is rarely discussed in the blogosphere. I think if we discuss and share our conversations, the thoughts and fears of the involved netizens who read SEO blogs, as well as of those who are totally unaware of what search and web optimization are all about, will come into the limelight and help us see SEO from a third-party perspective.

I start this series with a conversation I recently had with a reputed client who, for the past 18 months, has been very cooperative and has appreciated the SEO efforts we have put in.

The site statistics since we started working on the site in June 2012 are as below:

The site statistics after the initial SEO efforts started showing improvement in October are as below:

As you can see, despite Penguin, Panda and the recent launch of the Hummingbird engine, the site has remained stable and is bound to improve further with the content strategies planned for the future.
But the fears and questions the client voiced lately are as follows:
Client:

I just came to know that our competitor's site is getting more than 1,00,000 visits per month. Please let me know why our site is not doing as well.

(After I went through the competitor's site, I found that it was another initiative of a very famous site with a very good reputation and a very large link profile.)

Me:

Look, we cannot compare our site to the one you mentioned. The sites relate to the same topics, but their external link profiles and popularity are very different. The competitor site is benefitting from the reputation and popularity it has built over a period of 8-10 years, while our site is only 1.5 years old and has yet to gain popularity and good word of mouth over time. Instead of focusing on how many visits the competitor site is getting, let us focus on how to improve the services we offer online and work on strategies to improve outreach via social media and blog content. We should also think of how to keep in touch with our existing base of online customers. Talking about why the competitor is getting more visits will only mislead us and lower your confidence. In fact, with the kind of TV ads and offline advertising the competitor is investing in, the figure of 1,00,000 (which you consider very high) may be very low compared to their expectations.

We cannot judge the success of any campaign by one metric alone. If we compare the two sites, we know that we do not advertise on TV, which requires a very big budget, but they do. Their parent company has been online for nearly 10 years, so they already have a database of regular visitors and an established link profile. We, on the other hand, are a 2-year-old company with no prior online presence, and our offline presence has been very local, at the city level rather than the national level. But with the website, the company now has a global presence.

Client: 

Yes, but the visits have to increase ...

Me:

Yes. Having achieved the basic objectives of optimizing a site, such as sorting out the technical issues and completing the major on-page tasks, we now have to work on the social media and content strategies. Our main objective should be to ensure that visitors coming to our site get the answers to their queries. As you can see, the number of landing pages has increased, so visitors are reaching their destination pages in just one or two clicks. We have to see that they get the required information on those pages, with a good first impression and recall value, so that they recommend the site via social media channels and also post reviews. All these signals will not only increase visits via other sources but also improve our search presence, and thereby have a proportionate positive effect on visits via search engines too. Hence, if we work on the quality factors, the visits will increase. If we instead keep discussing why we are getting comparatively few visits, we are wasting our time, because compared to our previous presence we are doing quite well, so we have to forge ahead in the same direction. We can achieve our goal only if your expectations are realistic and our efforts point in the same direction.

Client:

I think you are right. Let us meet next to finalize the social media and content strategies.

The main objective of starting this client conversation series is to share my experiences with other SEOs so that we get a chance to learn from each other and are prepared when we face a situation similar to one already faced by someone else. Please write to us at info@webpro.in if you would like to share your client conversations on our blog.

As an SEO what would be your approach to the situation mentioned above? Please share in the comments below.

I hope this series will be informative and helpful to our readers.


We Have Moved To Our New Office (Some Pics.)

We are happy to announce that we have moved to our new office.

Our new office address is as follows:

WebPro Technologies

802, Astron Tech Park,

Opp. ISKCON, Satellite Road,

Ahmedabad -380015.

Gujarat, India.

It's Diwali and we shall be on leave till the 10th of November 2013. We hope to get back with enlightened enthusiasm after the Diwali break.

Enjoy your Diwali with lots of lights and love of near and dear ones. Hope Goddess Lakshmi brings in peace and prosperity for all.


Understanding The Purpose Behind Each Google Algorithm Update Is More Crucial For SEO


Google turned 15 last month and Amit Singhal posted on the Google search blog that:

"We’ll keep improving Google Search so it does a little bit more of the hard work for you. This means giving you the best possible answers, making it easy to have a conversation and helping out before you even have to ask. Hopefully, we’ll save you a few minutes of hassle each day. So keep asking Google tougher questions—it keeps us on our toes! After all, we’re just getting started...."

From the time it started as a search engine, Google has always focused on search as its main activity. Its social media initiatives (especially Google+) have primarily been made to add quality social signals to improve search results. In fact, I think each and every Google product, along with the search algorithm updates, is directly or indirectly connected to the quality of the search results displayed on the Google search engine.

The latest talk of the SEO town, the Hummingbird update, is the latest in the series of efforts Google has made since it started in 1997. Every update since then has been made with the goal of improving the quality of search results and displaying the 10 blue links as per the query the user searched.

Google has remained focused on one main goal, i.e. "to give quality search results to the user", and has been trying to get closer to this goal with each algorithmic update. But as Google is one entity and its users include each and every human on this planet, every update gets interpreted differently, and implemented differently, by each one.

Let us go into the past and dissect a little:

In the late 90s, when AltaVista was the main search engine in use and word-to-word mapping was the sole ranking factor, people spammed their websites by repeating keywords and by hiding keywords, camouflaging the font colour against the background colour of the page. When Google came up with PageRank technology to beat the keyword spam, Google became THE search engine and churned out high-quality search results, which made people discard AltaVista and make Google their constant search companion.

The PageRank technology (an effort by Google to improve search results for the user) is not bad; the way it was interpreted and spammed by people to rank better is what is bad. PageRank was spammed so much that a whole new industry, the link building industry, came into existence. Link building went to such an extent that people paid huge amounts to buy links. This link spam increased so much that the main purpose of PageRank was defeated, and to beat it Google recently had to come up with the Penguin update and the disavow tool (another effort by Google to improve search results for the user), which penalised websites with low-quality and non-topical links.

The Penguin update again put people on their toes and made them undo all the paid and irrelevant links to their websites, for which they again paid huge amounts. That is, first they paid to get links in order to remain in the search results, and now they are paying to remove those links for the same reason.

The PageRank technology remains the same and is still one of the main quality signals Google gets about a website, but as people spammed it, the essence of the whole technology evaporated, resulting in unnecessary clutter on the web and diluting the quality of the search results.

When Google integrated social signals (an effort by Google to improve search results for the user) into their search algorithms, the whole web took an about-turn to social media sites to remain in the search results. But again, just being on social media is not enough; the social media profile needs to send quality signals to search engines. Of course, exactly which social media parameters are integrated into search is a very debatable issue. In order to get first-hand social media data, Google came up with its own social media platform, Google+, and every business today, even if not active on Google+, has a Google+ profile. But is that enough? The answer is an emphatic NO, and beating spam on social media is the next big challenge for Google.

Apart from the signals sent to the search algorithms via links, social media, meta tags, etc., Google is currently focusing on semantic search, which looks at the meaning of the content in the search query and in the content of the websites in the Google index. For this, Google has introduced the Knowledge Graph, Authorship Markup and now the Hummingbird update.

Google started with the goal of giving quality search results to users, as per the hardware and software standards available at a given point in time. But as it went ahead, it also had to work at beating the spam which people kept bringing onto the web just to remain in the search results. This of course keeps Google on its toes and makes it keep working on becoming better than it was previously, which is in fact both good and bad for Google. Good, because all the spam Google has to beat keeps Google humble, which is very necessary given the kind of monopoly Google has on search; that monopoly could otherwise be very dangerous and unhealthy for the entire web. Bad, because unnecessary clutter keeps getting added to the web in the form of spammy links, low-quality content, a bad reputation for the SEO industry, misleading and wrong social signals, mistaken identities, etc.

Google has been focusing on the term QUALITY right from the beginning, but people have been interpreting the meaning of quality as per the algorithms instead of the true meaning of the term. We have to understand that one cannot buy reputation and popularity; we need to earn them. We cannot buy friends, followers and people to include in our circles; we need to make friends by the way we interact with, connect with and help them. Just adding content is not enough; every piece of content needs to be informative or offer the solutions people are searching for on the web. The content, links, social media signals and the overall online reputation resulting from the total footprint of your online presence, from the time you created an online identity, will determine your Knowledge Graph, which should become stronger with every Google update rather than force you to undo things to remain in the search results. After all, the online world is a reflection of the real world, and the same rules pertaining to life and business apply.

A very good real-life example of manipulation of schemas on a site came up during one of the sessions at SMX East 2013, held recently in the first week of October, which I attended.

Chris Silver Smith, President, Argent Media (@si1very), mentioned during his presentation that as Google is giving more importance to rich snippets and schemas, it is a good practice to add selective Google reviews to your site using schemas. But in reality, this is again like the misleading signals that spammy link building generated. The misinterpretation of PageRank created link clutter on the web, connecting all the wrong wires, and short-circuited quality rather than lighting the bulb. Similarly, adding hand-picked reviews to the site, even in the form of schemas, will not achieve the purpose of semantic search. The ideal way to add reviews and ratings is to place a rating-and-review tool on the site which users can use to express their views, so that Google gets a true idea about the product or service the business is offering and can correlate it with user experience and feedback to ascertain the quality signal.

Quality signals can be ascertained by Google only if it gets the aggregate feedback about a business via the user response on the site. When a website owner filters the reviews and adds only selective ones to the site, it surely falls under the spam category.
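As a purely hypothetical sketch of what aggregate (rather than hand-picked) review markup might look like, here is a schema.org microdata fragment; the business name and figures are placeholders, and the values are assumed to be computed from every review the on-site tool has collected:

```html
<!-- Hypothetical example: an overall rating computed from ALL
     reviews collected by an on-site review tool, marked up with
     schema.org microdata. Name and figures are placeholders. -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Business</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.2</span>/5 based on
    <span itemprop="reviewCount">87</span> customer reviews
  </div>
</div>
```

Because the rating reflects every submitted review rather than a filtered selection, markup like this sends an honest quality signal instead of a manipulated one.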

This was in fact pointed out strongly by Pierre Far of Google, who was attending the session and could not resist putting forward the point that any such efforts are not as per Google's norms.

Many times, a hasty implementation of any kind can unknowingly harm the site rather than healing it after a previous update or preparing it for a future one, as the example above shows. Hence, the right interpretation of Google updates, and an understanding of what Google is trying to achieve with each one, should be a priority for every SEO before going on an implementation and execution spree.

Content, Branding, Research And Strategy As The Root Of Any Internet Marketing Activity

The web has come a long way over more than a decade; it has evolved technically, and there have also been behavioural changes in the way people use it. In 2000, when the internet started being used on a large scale, mainly for communicating via email, online marketers focused on email marketing and banner ads on Yahoo, mainly because every second person had an email id on Yahoo and that seemed the most logical way of reaching out on the web in that era.

Later, in the mid-2000s, when Google updated their search algorithm and people started using Google for exploring and searching the web for information, SEO became the norm for all businesses aiming to establish an online presence.

When Facebook became the rage for staying connected, social media emerged as the latest norm for maintaining a web presence and reaching out on the web. People started focusing even more on their social media presence as search engines incorporated social signals into their search algorithms and also displayed those signals in SERPs.

As time progresses, search algorithms, online human behaviour and search results evolve. Hence, in 2013, for any business to have an assured web presence, it needs an overall multifold presence on the web and should avoid putting all its eggs in one basket.

Branching out to all the options available for web presence is the most sensible decision. The internet, as we know, is a network of networks, and we need to understand that every form of web presence sends signals, connects with every other, and makes each footprint stronger. It forms a chain and creates a network with your website as the hub, since the purpose of each online marketing activity is to attract targeted visits to the website, either via push marketing methods or by bringing people to your website through inbound marketing.

From the SEO perspective too, each and every online activity that creates web presence makes the search presence stronger, as search algorithms get signals from social media in the form of likes, shares, +1s, etc., and from blogs and other content platforms in the form of comments and the ripples the content creates when more and more people share it on their social media profiles.

The website is the trunk of the tree; the roots are the architecture of the site, which facilitates the to-and-fro transfer of information; and the branches are the various marketing activities which help the web presence branch out, reach out and bear the fruit of ROI.

Content, branding, research and strategy are at the root of any internet marketing activity.

Any tree can be strong and bear good fruit only if it has strong roots. Hence, we must focus on creating relevant, organized, benefit-driven (for the user) content; have a clear, compelling, focused brand message; and devise a strategy for attracting, engaging, converting and multiplying visitors to the website, which is possible only by researching our audience and competition.

The latest video by Matt Cutts (head of the webspam team at Google) also focuses on why we should diversify and have a web presence mix rather than focusing on just one aspect of web presence.


SMX East 2013 Insights On Topics Related To Organic Search For SEO 2014

SMX East 2013 was held soon after Google's 15th birthday, where Google had made two big announcements:

  1. Secure search made the default for everyone
  2. The "Hummingbird" search algorithm is live, especially designed to handle complex queries

These announcements have created quite a stir in the search marketing world and at SMX the discussions for organic search revolved around the following major topics:

  1. "Not provided" keyword data
  2. The Hummingbird algorithm
  3. Mobile is the future
  4. Where SEO is going in 2014

Major Takeaways From SMX East 2013 Related To Organic Search

About Not Provided Keyword Data:

  1.  100% “Not Provided” keyword data in GA will soon be a reality.
  2.  Accept it, get used to it and move beyond keyword data for search analysis.
  3.  Focus on the “Search Queries” data provided by Google Webmaster Tools.
  4.  Google is working on getting better, and Google's take on "Not Provided" is that they only want to respect and protect the privacy of the user.
  5.  But as Greg Boser rightly pointed out, "what blows me off is that retargeting is considered cool but organic search data is not," as advertisers continue to get the keyword data.

About Going Mobile

  1. Mobile is the next big thing. Hence, go mobile.
  2. By end of 2013 there'll be more mobile devices than people.
  3. The conversion rate of smartphone vs. desktop is .3x vs. 1x. The challenge for marketers, then, is understanding how conversions work.
  4. Have a direct connect with your audience across all devices. Do responsive design and also offer an App.

About Structured Data, Entity Search And Authorship Markup

  1. Entity search is about understanding the meaning of the search query and goes beyond just mapping word to word for giving search results.
  2. Search is heading to authorship and semantics.
  3. HTML was made to determine how a page looks and not for what it means.
  4. The semantic web has a Data Tier, Logic Tier and the Display Tier.
  5. Implement Twitter Cards and Open Graph on websites.
  6. Authorship Markup is one of the main pillars of the semantic search era.
  7. 90% of the data on the web has been created in the last 2 years.
  8. Authorship Markup connects the web searchers with the author.
  9. Authorship should be used on pages which have articles and insights about a certain topic .
  10. If you have multiple authors listed for an article, only one author pic will be displayed in SERPs, as Google's search UI currently does not support multiple authorship.
  11. Authorship needs to be verified via an email on the domain or by linking to the Google+ profile of the author.
  12. Under the contributor links of Google+ link to sites and blogs not to individual pages.
  13. Author byline is another important authorship indicator.
  14. Regarding the distinction between author and publisher: rel=author is for the person and rel=publisher is for the organization.
  15. rel=author, rel=publisher and organization markup should be implemented on the relevant pages.
  16. Danny Sullivan's take on Hummingbird (the latest algorithmic update): Hummingbird is all about tying together entities and bringing entity search to the next level. It isn't a "we don't use links anymore" algorithm; it's more of a "we're still looking at all these signals, but we're also looking at more streamlined, refined signals" algorithm.
  17. Stop worrying about how search algorithms are changing and focus attention on how user behaviour is changing. Searchers are getting smart, and kids are way over Facebook.
  18. Search engines need to understand and decode what gestures and conversational queries actually mean.
  19. According to Duane Forrester of Bing, the future of search looks like this: someone has an iPad, they look at a picture of the Empire State Building, they tap the picture, then they speak and ask, "What can I drink there?"
  20. Search engines need to make connections and understand gestures, context and demand, and entity search is headed in that direction. So it's not about Bing vs. Google or Google vs. Bing. Often both engines have to solve the same problems, so whatever path they follow, the destination is the same: "to meet the demand of the user".
  21. According to Greg Boser, building quality content for your audience and multiplying that audience with a good social strategy is the key. If you have an audience, then you have links too. Audience is everything. Leverage mobile, do responsive design and also offer an app; this connects you directly to your audience. In order to survive, it's not only about the white box (Google). According to Danny Sullivan, a search marketer is a person who keeps working on trying to understand how information is being searched.

SEO in 2014 is very much here to stay; by no means is SEO dead, but it is now all about content, context, connections and correlation. Search marketers need to move away from keywords, rankings and links alone. If you manage SEO in-house, then distribute SEO tasks, create an SEO education plan and expand SEO beyond the SEO team, because social media is not 100% of one person's job but 1% of everyone's job.

Previous Articles written by Bharati Ahuja on the above topics:

The Indian E-commerce Industry: Emerging Trends At A Glance

The Indian e-commerce industry may be in a nascent stage today, but it has the potential for amplified growth in the near future. It is changing the way business is done and sowing the seeds of a whole new economy. Rising incomes and a greater variety of goods and services that can be bought over the internet are making buying online more attractive and convenient for consumers all over the country. The internet is opening up new avenues for people in rural India, changing lifestyles and the quantity and quality of consumption, which is helping change mindsets in rural areas.

A quick glance at the eBay India Census 2012 report indicates that e-commerce is here to stay. The eBay Census 2012 findings were based on an analysis of all online buying and selling transactions by Indians on eBay between July 1, 2011 and December 31, 2012. At eBay India today, a mobile accessory changes hands every minute and a mobile handset every two minutes. India has 4,306 e-commerce hubs today, of which 1,015 are in rural India, indicating that consumers in the smaller towns of India are a major force in this online growth story.

According to IBEF (India Brand Equity Foundation), the sector is classified into four major types, based on the parties involved in the transactions:

  • Business-to-Business (B2B)
  • Business-to-Customer (B2C)
  • Customer-to-Business (C2B) and
  • Customer-to-Customer (C2C)

According to an Internet and Mobile Association of India (IAMAI) report, the overall e-commerce market in India has recorded a robust CAGR (Compound Annual Growth Rate) of 54.6 per cent and crossed USD10.0 billion during 2007–11. It is estimated to add another USD4 billion and reach USD14 billion by end-2012. Segment-wise, B2C dominated the sector with a 56.0 per cent share in 2010–11. Together, the B2C and C2C segments have shown significant growth; their aggregate market size stood at USD9.9 billion in 2011, while that of the B2B segment was estimated at around USD48.8 million. However, B2B's acceptance is on an upward trend due to rising awareness amongst Small and Medium Enterprises (SMEs), which are close to 13 million in number.

The United Nations estimates that just 31% of India’s population lives in urban areas, compared to 50% in China, 74% in Russia and 85% in Brazil. As a result, sales from consumers outside of metropolitan areas already make up half of total sales at some leading online retailers in India.

Flipkart today is valued at Rs1,000 crore, and speculation is rife that the e-firm is, in fact, worth Rs5,000 crore. It recently raised $20 million. Other Indian players – Myntra, Jabong, India Plaza and Junglee – are also eating into the e-commerce pie.

Notably, as can be seen in the image below, on the investment front the sector enjoyed an inflow of around USD800 million in 2011, up from USD110 million in 2010. Investments made in e-commerce businesses by PE firms alone more than quadrupled to USD467 million in 2011, compared to USD99 million in 2010. The number of deals increased to 78, compared to just 22 in 2010. The robust deal activity continued in 2012, with USD242 million invested during the January–April period. The trend over the period reflects that the average deal size has more than doubled due to increasing traction in e-commerce activities, which require larger investments for growth.

However, the major issues which the ecommerce companies face are as follows:

1. Building a good distribution network is a challenge, but many e-commerce giants are developing their own distribution networks, as delivery charges form a major component of the product cost, ranging between USD1.0–4.0 per item.

2. The huge investment made in ecommerce ventures has the potential to give beneficial returns in the long run only if the customer base increases.

3. The turnaround time for delivering a product can be reduced to 1-2 days only if the distribution system is improved, which will also help retain old customers and add new ones.

4. Retaining an existing customer is more profitable for a company, since acquiring a new customer costs USD15.0–20.0 on average.

5. Only those who can face these issues, retain customers and survive online can get a piece of the online market pie, as fundamentally weaker companies would lose out to established players. Hence, a focus on improving the distribution system and retaining existing customers is the key to e-commerce success.

On an average day on eBay in India:

According to a study by Forrester, India is expected to record the highest growth in the Asia Pacific region during 2012–16. The trend would shift, with the online retail segment contributing equally to the total market size, considering it is expected to grow significantly in the coming years. The B2C segment would continue to lead the e-commerce market, thanks to the budding Indian internet population, supporting demographics, ease of payment modes and customer-centric innovative policies.

Google+ Author Attribution And WordPress Sign-In

We all know by now that by adding Authorship Markup to our blog, we can have the author's pic displayed in Google SERPs. Adding the Authorship Markup is complete and possible only if the author has a Google+ profile and there is a two-way link between the content and the author's Google+ profile.

The author pic being displayed in SERPs is surely a good thing. It correlates the content with the author, gives the author an identity, and people start recognizing the author not only by his writing but by his face too.

Google announced yesterday that Author Attribution is also possible if you sign in to WordPress with Google; the articles you publish will then be associated with your Google+ profile automatically. Google also adds: "With this association in place, we can look for ways to surface your info when it's most relevant. For example, today users may see your name, picture and/or a link to your Google+ profile when your content appears in Search, News and other Google products."

With the majority of blogs on the WordPress platform, Google+ is surely going the right way to gain popularity. Google has integrated this with two major platforms so far: WordPress and Typepad. Google is also working with a variety of other sites, including About.com, wikiHow and Examiner, to learn and expand the pilot to all kinds of sites and apps using Google+ Sign-In.

The Android Ladoo Campaign - Let's Make Android Delicious

Android is the operating system by Google that powers over 1 billion smartphones and tablets. Google says that since these devices make our lives so sweet, each Android version is named after a dessert: Cupcake, Donut, Eclair, Froyo, Gingerbread, Honeycomb, Ice Cream Sandwich and Jelly Bean, and the latest version of Android is named KitKat.

Android KitKat

 

Indian developers have come up with a suggestion for the next version: they have suggested that it should be named 'Ladoo'. Ladoo is a round sweet made of flour, sugar and clarified butter, and is served on festive occasions and at religious ceremonies.

Android Ladoo

This group of developers wanted the KitKat version to be named KajuKatli, but as they were late in putting forward the suggestion, they have started a campaign to name the next version Ladoo.

If you support this suggestion, then visit http://ladoo.co/ ... Let's Make Android Delicious

 


Google Wants Feedback About Small But High-Quality Websites That Could Do Better In Search Results

Matt Cutts, the head of the webspam team at Google, recently tweeted asking small website owners to submit their sites via a Google Docs form.

 

On the form Google has mentioned....

“Google would like to hear feedback about small but high-quality websites that could do better in our search results. To be clear, we're just collecting feedback at this point; for example, don't expect this survey to affect any site's ranking.”

Usually, the big brands have an edge over the websites of small companies, mainly because of their extensive web presence and the greater number of inbound links their popularity brings.

In reply to one of the tweets, @mattcutts replied: "We're looking for feedback from a wider circle of folks so we can assess the scope of things." He further mentioned: "This is a separate attempt to see if there's small, good sites that should do better."

From his reply to @robdwoods' tweet on the 29th of August 2013, it seems only a few have filled in the submission form so far, and Google wants more data before coming to any conclusions.

Google’s New "In-Depth Articles" Feature For SERPs And How To Optimize Your Site For It

Google rolled out the “in-depth articles” feature for search results last week. Google says that to understand a broad topic, people sometimes need more than a quick answer, and its research indicates that at least 10% of people’s daily information needs fit this category: topics like stem cell research, happiness, and love, to name just a few.

Over the coming days, as this feature is rolled out for main search results, you will see results like the following when your search query is about a broad topic. It will be rolled out on google.com in English to start with.

In-depth Articles feature of Google For Search Results

Google also added on the Google Search blog that it is happy to see people continue to invest in thoughtful, in-depth content that remains relevant for months or even years after publication. This is exactly what you'll find in the new feature: in addition to well-known publishers, you'll also find some great articles from lesser-known publications and blogs.

This feature underscores how serious Google is about quality content that is timeless and offers the user substantial information on a given topic.

How To Optimize your site for the "In-depth articles" Feature:

Along with the metadata on the page and other on-page SEO factors, focusing on the following aspects for articles written in detail increases the potential for those articles to appear under “in-depth articles”.

1. Schema.org Article Markup

Specify the following attributes using the schema.org Article markup:

  • headline
  • alternativeHeadline
  • image (note: the image must be crawlable and indexable)
  • description
  • datePublished
  • articleBody
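The attributes above can be expressed with schema.org's microdata syntax. Below is a minimal sketch; the headline, image file name, and text are placeholder values, not from any real article:

```html
<!-- Illustrative placeholder values; replace with your article's real data -->
<article itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">What Is Search Engine Optimization?</h1>
  <h2 itemprop="alternativeHeadline">A Beginner's Guide to SEO</h2>
  <!-- The image must be crawlable and indexable (not blocked by robots.txt) -->
  <img itemprop="image" src="seo-guide.jpg" alt="SEO guide illustration">
  <meta itemprop="description" content="An in-depth introduction to SEO.">
  <meta itemprop="datePublished" content="2013-09-10">
  <div itemprop="articleBody">
    <p>The full article text goes here...</p>
  </div>
</article>
```

The `meta` elements let you supply machine-readable values (such as the ISO-format publication date) without displaying them to readers.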

2. Authorship markup

Authorship markup specifies the author and helps Google display the details of authors and experts in the search results.
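At the time of writing, authorship is established by linking the article to the author's Google+ profile with `rel="author"`. A sketch, where the profile URL and author name are placeholders:

```html
<!-- Byline linking to the author's Google+ profile (URL is a placeholder) -->
<a href="https://plus.google.com/112233445566778899000?rel=author">by Jane Author</a>
```

The author's Google+ profile should in turn list the site under its "Contributor to" section for the connection to be verified.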

3. Pagination and canonicalization

In-depth articles will naturally be lengthy and may be split into multiple parts. In such cases, proper pagination markup using rel="next" and rel="prev" helps Google's algorithms correctly identify the extent of such articles. In addition, it's important that canonicalization is done correctly, with rel="canonical" pointing either at each individual page or at a "view-all" page (and not at page 1 of a multi-part series).
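For a multi-part article, the pagination and canonical tags go in the document head of each part. A sketch for page 2 of a three-part series, using placeholder example.com URLs and assuming a "view-all" page exists:

```html
<!-- In the <head> of page 2 of a three-part article (URLs are placeholders) -->
<link rel="prev" href="http://www.example.com/seo-guide?page=1">
<link rel="next" href="http://www.example.com/seo-guide?page=3">
<!-- Canonical points at the view-all page, never at page 1 of the series -->
<link rel="canonical" href="http://www.example.com/seo-guide?view=all">
```

If no view-all page exists, each page instead carries a self-referencing canonical (page 2's canonical points at page 2's own URL).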

4. Logos

If you want your website's logo to appear beside your in-depth articles, Google suggests using organization markup or linking your site to its Google+ page.
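For the logo, Google's documented options are schema.org Organization markup or a publisher link to the site's Google+ page. A sketch with placeholder URLs:

```html
<!-- Option 1: Organization markup identifying the site's logo (URLs are placeholders) -->
<div itemscope itemtype="http://schema.org/Organization">
  <a itemprop="url" href="http://www.example.com/">Home</a>
  <img itemprop="logo" src="http://www.example.com/logo.png" alt="Company logo">
</div>

<!-- Option 2: link the site to its Google+ page (page ID is a placeholder) -->
<link rel="publisher" href="https://plus.google.com/109876543210987654321">
```

Either approach gives Google an unambiguous association between the site and the logo it should show.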

5. Restrictions on content accessibility

If readers need to log in or register to access the article, it may not be possible for Google to crawl and index it. If Google can't properly crawl and index your content (including text, images, and videos), then Google will not be able to show it in the search results (including the "in-depth articles" feature). Implementing First Click Free is one easy way to make sure your content is accessible to Google's search crawlers so it can be displayed in Google search results.

In my opinion, if your blog has many such in-depth articles appearing in these results, it will help the blog become an authority on the related topic and establish the author's thought leadership.

This is a great effort by Google to separate the grain from the chaff, especially in this era when, after the link spam methods, content spam is on the rise.
