In early June at SMX Advanced, Matt Cutts discussed the importance of links versus social signals for a site, from the SEO and search engine presence perspective.
The video below shows Danny Sullivan and Matt Cutts discussing this topic.
The SEO Cocktail
Content Marketing advocates the belief that if you share valuable, relevant, analytical and insightful content in various forms, you tend to correlate your persona on the web with the shared content and thereby with the industry related to it. Digital content can be in the form of blog posts, images, videos, comments, reviews, podcasts, webinars, etc.
Sharing so much content naturally generates responses, which can be appreciation, acknowledgement or total rejection in the form of harsh criticism and negative feedback. Everyone can accept the bouquets, but we need to handle the virtual brickbats sensibly and gracefully as well.
As your blog gets more readership and more and more people acknowledge your online presence on various platforms, a healthy mix of reactions is generated online, and facing them confidently and gracefully adds to the authority, trust and, if I may add, the character of your online persona. Not everyone is going to agree with the views expressed in the content you share or publish. If everyone agrees, that itself is a matter of concern.
The first and foremost rule for taking negative feedback constructively is “Never Take It Personally”. Most of the time the person who has expressed the negative feedback in the form of comments or reviews does not know you personally; it is a response to what they understood from reading what you have written. Hence, it is a personal opinion and a perception, not a direct attack on you as a person.
Negative feedback can be a genuine difference of opinion or can arise from malicious intent too. But negative feedback can be a powerful tool to establish online authority and gain customer loyalty if handled in an unbiased manner.
Some Points To Ponder:
1. Accept the fact that negative comments and feedback are bound to come your way if you are honest enough in expressing your thoughts and opinions.
2. You can receive positive comments all the time only if you are part of a mutual admiration community rather than the web community as a whole. In such a case you may get to read positive words about your product, your company and yourself, but you develop a negative online persona in the long run.
3. Standing by your thoughts, opinions and beliefs help you establish a unique online persona with an aura of self-confidence and uniqueness.
4. Never take them personally.
5. Acknowledge the feedback and reply accordingly.
6. A conversation thus generated helps to put forward your point of views and beliefs which in a way can earn you the authority and the trust factor.
7. Read the comment carefully, gather all the details and discuss further.
8. If need be, you can take the matter offline too by discussing it via phone.
9. Usually negative feedback comes from a dissatisfied or misinformed customer or a jealous competitor. In both cases, communicating with the right attitude helps solve matters.
10. Surf the web and see how others have handled such situations.
11. Every experience enriches you with knowledge. Make the most of it rather than shying away from it as it can be a golden opportunity in disguise.
12. Accepting an error or mistake made is a sign of strength and if you are right then standing by what you have said reflects the same strength of character.
Rejection and negative feedback are part and parcel of content marketing. Learning how to deal with them in the right spirit helps you mature as a netizen and can put you in a win-win situation.
If the negative feedback comes from unknown sources or from the same sources repeatedly, you can rest assured that you’re doing very well and some people just can’t take it in the true spirit of sportsmanship! Be a smiling loser and a humble winner. When a winner makes a mistake, he says "My fault"; when a loser makes a mistake, he throws the blame on someone else.
Here's a video of a recent hangout that @avinash did with the +Think with Google team. The topics covered included:
And so much more....
The Key Takeaways From The Hangout:
· The distance between a company and its brand getting destroyed is just 2 pixels.
· Focus on sharing knowledge and information on social media which will add value to the lives of the people
· Understand the power of the medium
· Until now brands have used the broadcast aspect of social and not the engagement aspect, and when they fail to understand the engagement aspect it turns into an amplification nightmare
· It is like waking up every morning and walking the talk
· Every employee is a brand ambassador and the social media reflects the culture of the company
· To have some control over social media sharing on behalf of the company, there can be a rule book or broad guidelines for the employees who share on its behalf, but have faith and assume intelligence on the other end.
· The rule book and guidelines should be broad parameters rather than strict dos and don'ts.
· Review these parameters every 6 months and offer training to make the social engagement and amplification beneficial for the company
· Doing social effectively takes work and a shift in mind set and a new culture
· Companies using social effectively: Cadbury, Esquire Magazine, BMW.
· People who get into social now are going to have a great advantage in the future
Related Posts:
http://www.searchenginejournal.com/the-traditional-media-of-marketing-to-the-internet-media-of-marketing/23947/
http://blog.webpro.in/2011/07/setting-purpose-of-your-social-media.html
http://blog.webpro.in/2010/08/just-seo-is-not-enough-you-need-web.html
knowledge remix blog as a graph (Photo credit: g.janssen)
The Wall Street Journal some time back mentioned in a post that “Google is aiming to provide more relevant results by incorporating technology called "semantic search," which refers to the process of understanding the actual meaning of words.” Related to that, we had posted our views on http://blog.webpro.in/2012/03/my-views-on-wsj-post-google-gives.html .
After the Panda and Penguin updates, Google on 16th May 2012 announced its Knowledge Graph in a blog post titled “Introducing the Knowledge Graph: things, not strings”.
In this article it is explained that:
The Knowledge Graph enables you to search for things, people or places that Google knows about—landmarks, celebrities, cities, sports teams, buildings, geographical features, movies, celestial objects, works of art and more—and instantly get information that’s relevant to your query. This is a critical first step towards building the next generation of search, which taps into the collective intelligence of the web and understands the world a bit more like people do.
Google’s Knowledge Graph isn’t just rooted in public sources such as Freebase, Wikipedia and the CIA World Factbook. It’s also augmented at a much larger scale—because we’re focused on comprehensive breadth and depth.
This is the first step in trying to convert an information-based engine into a knowledge-based engine. This kind of engine is based solely on data and data relationships. Whatever useful information people search for offers valuable data for the Knowledge Graph, which can be helpful to others searching on the same topic later.
The search queries help Google add and correlate data, thereby enriching the knowledge-based search engine results. An information-based search engine takes the string of words from the search query, maps it against the content indexed in its database and displays in the SERPs the sites responding strongly to the ranking factors in its algorithms.
In the case of a knowledge-based search engine, an additional dimension, the knowledge factor, is added. It is determined by correlation, relationships and how the content of the site is interlinked with other entities on the web. This is one way the Knowledge Graph makes Google Search more intelligent. The results are more relevant because the search engine understands these entities, and the nuances in their meaning, the way you do. This makes the search engine think more like the user.
Google says…
“We’ve always believed that the perfect search engine should understand exactly what you mean and give you back exactly what you want. And we can now sometimes help answer your next question before you’ve asked it, because the facts we show are informed by what other people have searched for. “
How does one optimize a site for such a knowledge-based search engine?
· Have loads of relevant, knowledge-rich content in all forms on your site.
· Share that content all over the web so that it gets correlated with the relevant topics and the relevant industry.
· Every piece of content on the web is data. How well this data gets indexed and how well it gets interlinked is important.
· Represent the following content on the site as microformats or as structured data:
Reviews
People
Products
Businesses and organizations
Recipes
Events
Video
To check your markup, use the rich snippets testing tool.
· Aim at establishing an authority and brand on social media
· Participate in discussions to put forward your opinion
· Have an active Google+ business page presence
· Use Google+ as a blogging platform
· Do not focus on Guest blogging and blog commenting for link building but post comments and guest blog for sharing your knowledge
· Forget about keywords and focus on the keyness
· Implement the hCard Code for the address on the contact page
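As a minimal sketch of that hCard markup for a contact page (the company name, address and phone number below are placeholder values, not real details):

```html
<!-- hCard microformat: class names like "vcard", "fn", "adr" are the
     standard hCard property names; the values are placeholders. -->
<div class="vcard">
  <span class="fn org">Example Company</span>
  <div class="adr">
    <span class="street-address">123 Example Street</span>,
    <span class="locality">Ahmedabad</span>,
    <span class="region">Gujarat</span>
    <span class="postal-code">380001</span>
  </div>
  <span class="tel">+91-00-0000-0000</span>
</div>
```

The rich snippets testing tool mentioned below can be used to confirm that the parser picks up these properties.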
· Do not neglect the On Page Optimization as the knowledge quotient will be applicable only if the relevant pages respond to the basic ranking factors.
· Keep improving the Technical aspect of SEO in response to the data received from Webmaster Tools
We all know that Google is all out to fight spam and is aggressively working to improve the quality of search results like never before. Since Feb. 2011 we have had a series of Panda updates, and now the Penguin update is the latest antidote added to the search algorithm for devaluing sites which adopt manipulative methods and spam inbound links to deceive Google into granting high rankings. In simple words, Google is penalizing sites which have broken its rules and guidelines. Google has always considered SEO a positive and constructive activity for achieving good search visibility. According to Google, SEO IS NOT SPAM. But there is a very thin line between SEO which is positive and constructive and SEO which uses manipulative methods to achieve the desired result. It is usually referred to as white hat v/s black hat SEO.
White hat methods for SEO are very simple and straightforward; black hat techniques, on the other hand, are complicated and understood only by the people who adopt them, as they are not the standard norm.
Post Penguin update the SEO blogosphere is filled with posts on Over Optimization, Link Pruning, Natural way of acquiring links, etc.
But, first of all, 'to optimize' means to achieve a perfect balance – nothing more and nothing less. According to the Merriam-Webster dictionary, to optimize is to make as perfect, effective, or functional as possible. In engineering, optimization is a collection of methods and techniques for designing and using engineering systems as perfectly as possible with respect to specific parameters. So, if you have been optimizing sites as per the Google guidelines and adopting the right white hat off-page and on-page techniques, then I don't think this kind of update should worry you as an SEO; let all the pandas and the penguins play to the glory of Google.
Antarctic: Signy Island - Adelie penguins (Photo credit: ¡WOUW!)
Google is becoming more and more efficient at detecting low quality content, duplicate content, innumerable low quality inbound links acquired through unfair means, doorway pages, keyword stuffing and unreasonable use of footer links with keyword-stuffed anchor text to manipulate and misguide Googlebot. But again, if you have not been doing this in the name of SEO, then there is no need to worry.
The Penguin update, if there really is one (or if it is just a false alarm by Google to instill fear and initiate self-correction), is basically making the SEOs who have been following the above methods go crazy, as it plays on their guilty conscience and sends them on a rectification spree; the others, I think, do not have to bother at all. We have not made any changes on the sites managed by us, and practically 90% of our sites have shown an increase in targeted traffic and improved rankings since 29th April 2012.
I believe this series of Panda and Penguin updates will still take at least another six months to effectively churn out stable quality results and penalize sites using manipulative methods to deceive the Googlebot.
Many people have been writing about how SEO should be an invisible layer which enhances the search visibility of the site but does not show on the site. I think SEO is that technical layer applied to the site to make the search engine bots read and index what your visitors are reading. Your site communicates with the search engine bots not only via the HTML tags in the <Head> and <Body> sections but via other files too, like robots.txt, sitemap.xml, .htaccess, the HTTP headers, etc., and the medium of instruction is the webmaster tools the search engines provide for them. Hence, off-page optimization also includes how well you communicate with the search engine via these tools. Even the analytics used by the search engine to report the progress of your site give it a lot of information about certain quality metrics. In the long run, once the on-page optimization is set up, if you focus on the natural way of building off-page SEO, you will see that the better your site performs, and the more the online sales, likes, +1s and links from your site get shared on social media, the more automatic momentum the site gains, as all these metrics and signals boost the potential energy of the site, which in a way gets reflected in its kinetic energy flow in the SERPs.
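As a simple illustration of that communication layer, a minimal robots.txt (the disallowed path and sitemap URL here are hypothetical placeholders) is one of the plainest channels through which a site talks to the bots:

```
# Minimal robots.txt sketch (hypothetical path and domain)
User-agent: *
Disallow: /admin/

Sitemap: http://www.example.com/sitemap.xml
```

The sitemap.xml it points to, along with the webmaster tools, completes the two-way conversation: you tell the bots what to crawl, and they report back what they found.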
The initial on-page and other SEO techniques give the site its initial push, but for the site to keep doing well, users have to show that they like being on the site and complete the call to action on its pages. For instance, if it is an ecommerce site and the online sales show an increase, with more and more repeat visitors buying online along with the new visitors, then your site is surely going to get a boost from the search engines, as visitors trusting you with their credit cards proves that the trust factor of the site is high.
I think if we optimize the site, continuously update it with great content, share that content on the web and make it capable of being shared further by others, then SEO is like the initial push a body at rest gets: it keeps moving ahead unless acted upon by an unbalanced force. The Panda and Penguin updates are that unbalanced force, applied by Google to stop the momentum of sites using unfair means, not of sites which are optimized according to its guidelines.
You have to learn the rules of the game. And then you have to play better than anyone else. - Albert Einstein
I am sure we all in the web arena have posted comments on blogs, written blog posts or shared and had conversations on social media sites either at a personal level or at a professional level. Lately, I have been in the retro mood since I started working on server logs for SEO analysis.
I think it is a good thing to rewind the tape of life sometimes, get some insight into our past actions and clear our mind to plan for the future. Hence I decided to focus on the footprints created by WebPro Technologies by way of blog commenting in the past. This is not a client report, so there are no metrics to monitor, but a true self-analysis of what was discussed and quoted by us on other blogs and social media sites, and how much we have adhered to it. This also gives us a chance to analyze what our views were in the past as compared to the present changing scenario.
Post the Panda and Penguin updates there are umpteen posts on natural links and link pruning, but we have always been of the opinion that the term “Link Building” itself is wrong: you do not build natural links, they get built in the process of the quality footprints you make on the sands of the web during your web journey.
One of the ways to judge the knowledge and ideologies of an SEO company is to read what they have written in the past by way of comments, blog posts, social media conversations, reviews and forum discussions about various topics and issues, and see if they are still consistent in what they say and how those past opinions have shaped up in the real world today.
After all, the words written on the canvas of the web on various platforms are not mere words but content in different forms, around which the whole web world revolves. Authentic, valuable content should stand the test of time and add value to the authority factor of the author. I have been commenting on and discussing many topics related to the search industry, and have also been sharing links to posts written by me where there was a strong correlation with the blog topic, and never bothered if there was a 'nofollow' attribute on the comment links or if I got any number of thumbs down for it, as my main purpose was to put forward an opinion which I strongly believed in or felt about.
The past content posted by a person can say a lot about the knowledge, beliefs and long-term goals of that persona. It can reflect the solidity of the viewpoints made by that person, and the present scenario can tell us whether they have adhered to and stood by what they said, and whether there is any coherence between what they say and what they do. This kind of check can also make people think before they post, and becomes a self-check ensuring that quality content is published on the web.
Our business associate @wasimalrayes suggested that this could be a comment archive which can be added to the site and updated with every comment made on the web. I think it is a good idea which not only adds your web voice to your site but also acts as a self check tool for responsible content addition to the WWW.
We all have a blog archive, so why not a comment archive too? After all, comments are also mini blog posts posted by us on the web which reflect our opinion and perspective regarding the relevant topic.
What do you think about the ‘Comment Archive ‘ section being added to the site?
We would like to share some of the past comments, blog posts and social media conversations we have had on the web regarding various topics. Since the blogosphere is brimming with posts regarding the Penguin and Panda updates, I’ll start with link building:
Topic 1 Link Building:
Some Blog Comments Made By Us In The Past Reflecting Our Views On Link Building:
Our Comment
WebProTechnologies | January 3rd, 2012
All the predictions for this year are spot on. I agree with all the points predicted.
Regarding #3, I think Google might just give us a surprise this year by giving less importance to inbound links. Only the links which come from trusted, high authority sites and editorial links will matter and be taken into account. The major focus will be on the social media signals which reflect the trust and the authority factor. Hence, the context in which links are shared on social media, and the discussions and reactions surrounding them, will make a big impact.
Hence, stop the link building nuisance and focus on building quality content (in all forms, images, text, video, audio, etc. and share it on social media) and let the natural links get built...
Comment :
February 20, 2010 at 10:24 am
Totally agree.
In my opinion everybody has just gone too far thinking only about how to get more and more links. I am sure when the PageRank concept was framed, the main purpose must have been to judge the true goodwill and popularity of a website in direct proportion to the number of inbound links it has.
But with all these ethical and unethical methods of gaining more and more links the whole purpose is defeated.
If the site has good informative content and ranks high in the search engines through ethical SEO practices, then it automatically gets a lot of links from various sources.
The main purpose of a genuine searcher is to search for what is available globally and locally. Once the searcher finds it, it surely gets added and linked by him in various ways.
Instead all the energies and efforts should be concentrated on building the website qualitatively in various ways by adding more varied content.
Don’t run after links. Let them come to your website genuinely.
Our Comment:
WebProTechnologies | May 21st, 2010
Aptly put at the very beginning of this post that link building is a task which is detested by all.
I am of the opinion that the term 'link building' itself is an incorrect term. Links do not have to be built but they should get built naturally in the process as your website starts getting a wider web presence and preference.
As every link is like a vote to your site and goodwill of your company and that has to be earned as part of the web journey of the website.
If we focus on the quality content, have a good site internal linking architecture, have a site which is visitor friendly as well as robot friendly then getting high SERPs is not a difficult task.
Once you have high SERPs, trust me, there will be loads of directories and portals adding your site to their listings even without you knowing about it, as they too are looking for quality listings.
Once upon a time the dmoz listing was something you always wished for once you submitted your site to dmoz, as that surely was a valuable link. I don't know if it still has that importance, but I still manually add each site to dmoz.
Apart from a good qualitative site in all respects other genuine methods of gaining natural inbound links as your website goes from one milestone to another are as follows:
Focus all your efforts on making the site informative, qualitative and content rich to get links automatically.
Do not neglect the On-Page Optimization Basics and just go after links. (Very important from the SEO perspective)
Participate in social media networks for discussions and sharing of information and mention links to the relevant pages of your website. (It need not be the Home Page always)
Have a social book marking button on your website.
Make RSS feeds available on your website.
Issue Press Releases periodically.
Our Comment:
WebProTechnologies | May 31st, 2011
Well, despite all the thumbs down, my opinion still remains the same. Quality content on your website and a quality web presence across all the search options, blogs, discussions, social media, etc. will always be rewarded in an increasing manner in the long run by any search engine, and will result in targeted inbound traffic.
We do a fairly good job on SEO and rankings without focusing on link building; in the process we educate and train our clients to effectively maintain their blogs and social media accounts, and in the bargain they end up getting quality links. It has worked for us.
Our Archived Blog Posts On Link Building:
Topic 2 Social And Search Integration:
Blog Article / Social Media Post
Our Comment:
Yes initially Altavista was THE SEARCH ENGINE and keyword spam was something that Google had to work on to improve the quality of search results for which they came up with the PageRank Technology to add value and quality to search results.
But as every coin has two sides, this innovation also gave birth to link building spam, and despite the improvement in the search results which established Google as the topmost search engine, it polluted the web with unnecessary content clutter.
But as people kept flocking to the social media sites, the search engines thought of using public opinion as the criterion for quality and word of mouth. How well the search engines will integrate the social media signals, only time will tell.
But it is for sure that this will ensure more genuineness, as you cannot manipulate public opinion. SEO is what you say about your company; social media is what others say about your company. When both these messages are in sync, credibility is established. Hence authority, credibility, WOM and an overall presence are the demand of the day for true SEO, which in the long run will ensure natural and quality inbound links on its own.
So first work on content, establish an identity, authority and online credibility, and then the links will follow. If we go to see, that was the main goal of the PageRank technology: to check how many people vouch for a certain page's content. With link spam it got negated. Now, with social media signals and the focus on quality content via the Panda update, this will surely be taken care of to a great extent.
The best way to achieve great online presence will be to have an equally great offline and real time business presence: http://blog.webpro.in/2011/10/best-way-to-assure-quality-content-is.html
I will not be surprised if in the coming year the blogosphere gets bombarded with blog post meteors on "THE DEATH OF THE SPAMMY LINK BUILDING INDUSTRY" instead of SEO being dead.
Blog Article / Social Media Post
Our Comment:
Bharati Ahuja 11th January 2012
I think the blending of social results in search is not only the inevitable evolution of search but the reflection of what took place when civilizations evolved. We can just say that the stone age of search is over and now search even has the ability to reflect what people in your community are talking about and recommending. It is basic human nature to search for a want and then discuss with peers about their opinions and then take a decision. Since ages we have been doing this but now we have to just adapt ourselves to the virtual world for this kind of an action.
To a certain extent I believe that if Google wants to improve the quality of search results and combat spam on the web, then yes, it is highly essential that the search engine can access data from a resource it has full control over. But from the search engine perspective, only time will tell how well Google succeeds in integrating the social signals from other social media sites all over the web; otherwise, with the kind of hold Google has over the search market, it is going to be Google, Google all the way…
But it's surely not the end of SEO. In fact, all these changes are taking SEO to a more qualitative level.
Archived Blog Posts On Our Blog:
http://blog.webpro.in/2011/02/integration-of-social-and-search.html
http://blog.webpro.in/2010/06/search-seo-and-social-media-integration.html
Topic 3 “Not Provided Keyword Data”
Our Guest Post On The Topic:
Archived Blog Posts On Our Blog:
http://blog.webpro.in/2011/11/search-queries-googles-encrypted-not.html
Blog Article / Social Media Post
IMHO especially with regard to Keyword Referrer Data:
2011 was a year of changes and I think it is a period of transition to a better web and better search results as SEO is much beyond keywords and rankings.
When businesses are at a loss for the complete keyword data, the focus shifts to the search queries in WMT which have a good CTR, a true measure of quality over quantity.
This restriction makes the website owner think from a larger perspective and focus on the correlation of content and keywords rather than rankings. It will take SEO campaigns beyond the metrics of keywords and rankings, and the focus will shift to other quality metrics like CTR, conversions, bounce rate, etc., which will improve the quality of the web overall, as websites, besides being rich in content, will have to focus on good landing pages, a proper call to action, page load speed and good navigation, ensuring a better UX.
This lack of data will draw the line of distinction between a PPC campaign and an SEO campaign. The quality metrics will be CR and CTR, which again will make the client focus on content and landing page design. Rather than discussing keywords, the client will be open to discussing content and design, which is again a quality step towards a better web world.
Have shared my views also on http://blog.webpro.in/2011/11/search-queries-googles-encrypted-not.html
Server Logs (Photo credit: novas0x2a)
Data, Data From Everywhere On The Server And Not A Byte To Benefit From....
All those who have been in the web solutions business since the early 2000s know that prior to Google Analytics, the most trusted analytics data was the log files on the server. Those log files are in fact still the most accurate and raw data available for the actual activity taking place on the server.
Server logs are created automatically, recording the activity on the server, and are saved as log files on the server itself. Usually a log file is saved as a standardized text file, but this may vary depending on the server. Log files can be a handy tool for webmasters, SEOs and administrators. They record each activity on the server and offer details about what happened, when, and from where, for that domain. This information can record faults and help diagnose them. It can identify security breaches and other computer misuse. It can be used for auditing and accounting purposes too.
A plain text format minimizes dependencies and assists logging at all phases. There are many ways to structure this data for analysis; for example, storing it in a relational database would force the data into a query-able format. However, it would also make it more difficult to retrieve if the computer crashed, and logging would not be available unless the database was available. The W3C maintains a standard format for web server log files, but other proprietary formats exist. Different servers have different log formats; nevertheless, the information available is very much the same. For example, the fields available are as follows (they may not necessarily be recorded in the same order on all servers):
· IP address
· Remote log name
· Authenticated user name: only available when accessing content which is password protected by the web server's authentication system.
· Timestamp
· Access request: "GET / HTTP/1.1"
· The request made. In this case it was a "GET" request (i.e. "show me the page") for the file "/" (the homepage) using the "HTTP/1.1" protocol.
· Detailed information about the HTTP protocol is available at http://en.wikipedia.org/wiki/HTTP.
· Result status code: "200"
· The resulting status code. "200" is success. This tells you whether the request was successful or not.
· For a list of possible codes, visit http://en.wikipedia.org/wiki/List_of_HTTP_status_codes.
· Bytes transferred: "10801"
· The number of bytes transferred. This tells you how many bytes were transferred to the user, i.e. the bandwidth used. In this case the home page file is 10801 bytes, or about 10K.
· Referrer URL
· User Agent
Following is an example of the data exported to Excel from the log file:
Example 1:
180.76.6.233 - - [29/Apr/2012:05:04:56 +0100] "GET /blog/microsoft-windows-vista-ultimate-with-sp2-64bit-oem/ HTTP/1.1" 404 39621 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)"
Example 2:
On some servers the field names are listed in the log file before the data is recorded, and the corresponding data for each entry then follows:
#Fields Per Record: date time cs-method cs-uri-stem cs-username c-ip cs-version cs(User-Agent) cs(Referer) sc-status sc-bytes
Data Per Record: 2012-05-01 01:19:17 GET /seo-web-design.htm - 207.46.204.233 HTTP/1.1 Mozilla/5.0+(compatible;+bingbot/2.0;++http://www.bing.com/bingbot.htm) - 200 12288
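The log lines above can also be parsed programmatically. As a minimal sketch (the regular expression and field names below are illustrative, written for the common Apache/NGINX "combined" format, not a standard API), here is how Example 1 could be broken into its fields with Python's standard library:

```python
import re

# Illustrative regex for the common "combined" log format:
# IP, remote log name, user, timestamp, request, status, bytes, referrer, user agent.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) (?P<remote_log>\S+) (?P<user>\S+) '
    r'\[(?P<timestamp>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-) '
    r'"(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

# The line from Example 1 above.
line = ('180.76.6.233 - - [29/Apr/2012:05:04:56 +0100] '
        '"GET /blog/microsoft-windows-vista-ultimate-with-sp2-64bit-oem/ HTTP/1.1" '
        '404 39621 "-" '
        '"Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)"')

match = LOG_PATTERN.match(line)
if match:
    fields = match.groupdict()
    print(fields["ip"])      # 180.76.6.233
    print(fields["status"])  # 404
    print(fields["user_agent"])
```

Once each line is a dictionary of named fields, filtering by status code, bot user agent or bandwidth becomes a one-liner.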
Well, it is not as geeky as it looks; in fact it is very simple. The data from log files can be retrieved easily by importing the text into Excel, or by using standard log-analysis software such as WebLog Expert; a sample report can be viewed at http://www.weblogexpert.com/sample/index.htm
Analysis of the log files can offer great insights about the traffic on the server. Many times spam on the server or a hacking attack can be detected early and the harm to the sites greatly reduced, because corrective action can be taken immediately. This can be a real boon to every SEO, as this data is reflected in WMT much later.
The data can be filtered by the fields that need to be tracked. For example, the image below shows how the WebLog Expert software presents the data graphically and numerically for a filter we applied to trace Google, Bing and Baidu bot activity on a particular domain.
Keeping track of this data gives us information about bot crawling, downloads, spam attacks, etc. Of course, it is all raw data, and data in itself is meaningless; how you correlate it and connect the dots to reach correct conclusions and take the right decisions is what makes the difference.
For me it is a déjà vu feeling: before we had Google Analytics, server log files and Webalizer were the only resource. Sometimes going retro is the coolest thing to do, because some trends that seem new are actually very old.
The top five entrepreneurs of 2011 truly demonstrated their ability to start profitable and sustainable businesses. The businesses and their founders are as follows:
cc licensed flickr photo shared by techatnyu
1. Rohit Arora: Biz2Credit
Rohit Arora, founder of Biz2Credit, capitalized on the banks’ ability to develop profitable portfolios with small business loans. He also noticed that ethnic-owned firms had the fewest defaults and used this information to develop his business model. Since immigrants’ largest impediment to getting loans in the United States was a lack of understanding of the process, Biz2Credit developed an immigrant-friendly online application process for commercial loans and credit reports.
Once credit scores are determined, immigrants are matched with three to 10 potential lenders. Financial documents can be stored online until a loan is granted or denied. The process is streamlined and innovative. Immigrants can build their business credit and expand their businesses with these opportunities.
To date, this Manhattan firm has facilitated a total of $400 million in loans, typically between $200,000 and $4.5 million each. Fifty-five percent of Biz2Credit’s customers are immigrants.
2. Anthony Casalena: SquareSpace
Squarespace developed from Anthony Casalena’s frustration of integrating photos, analytics and blogging tools. He developed a publishing platform to solve the problem. The software integrates websites, Flickr and Twitter easily. He used a $30,000 investment to buy servers to support his platform.
The platform costs between $12 and $36 per month and supports clients such as Kiehl’s, a cosmetics retailer, and Marc Ecko Enterprises, a fashion firm. In 2010, the company reported sales of as much as $10.2 million.
3. Michael Dorf: City Winery
City Winery founder Michael Dorf addresses the needs of winemakers who want fresh grapes from California, Oregon or Chile vineyards. The company also hosts concerts and wine tastings, and sells wine by the glass. The event space is used for wedding receptions and private parties. The main portion of his business sells winemaking memberships. In 2011, Dorf expected to produce $10 million in revenue.
4. Karen Grando: International Asbestos Removal
International Asbestos Removal founder Karen Grando went into the asbestos-removal field after hearing her husband repeatedly turn down clients requesting the service. She enrolled in a training program to earn city and state licenses to perform this trade. She has won contracts with the Metropolitan Transportation Authority, the Dormitory Authority of the State of New York and the School Construction Authority. In 2010, the firm had revenue of $10 million.
5. Michael Kirban: Vita Coco
Vita Coco was started by Michael Kirban and Ira Liran after they overheard two Brazilian women talking about coconut water. The founders felt coconut water could be as profitable as orange juice, and used $100,000 of their own savings to start manufacturing in Brazil. The popular drink made its debut in Whole Foods in New York, and Vita Coco is also sold at many Walmart and Costco locations.
Demi Moore and Matthew McConaughey both invested in the popular drink; celebrities have put over $10 million into its development. The business’s growth was limited only by the number of coconuts it could obtain. Last year, the company managed nearly $80 million in sales. The company is also charitable, having donated $2 million to the American Heart Association, the Special Olympics and other charities.
Guest Post: Robert Kurtz is a career consultant and contributor for Top Business Degrees, a site with detailed information, reviews and guides about online business degrees.
Augmented reality (AR) is one of the technologies gaining increasing interest. By mixing virtual with the real world in different proportions, augmented reality allows pulling graphics out of the television screen or computer display and integrating them into real-world environments.
A smartphone app called MagicPlan uses augmented reality to capture building topology. (Photo credit: Wikipedia)
Augmented reality changes the way we view the world. The goal of augmented reality systems is to combine the interactive real world with an interactive computer-generated world in such a way that they appear as one environment.
Augmented reality blurs the line between the real world and the virtual world. The goal of augmented reality is to add information and meaning to a real object or place. It takes a real object or space as the foundation and incorporates technologies that add contextual data to deepen a person’s understanding of the subject.
The video below explains how augmented reality will take shape in the future:
AR uses computer vision to observe the world as it is and present it as a scene on a computer or smartphone. AR is currently associated mostly with local search, but it will soon move into ecommerce for 3D product viewing and animation.
The entertainment, education and ecommerce industries will be the first to benefit from it. AR is currently still being experimented with on smartphones, but it has the potential to move to a device like glasses, where it can easily be used for every query that comes to mind. By embracing AR we open up a new world where our mind is the only boundary.