SEO Web Development | Custom Web Design

We offer affordable web design and e-commerce web development dedicated to providing web-based solutions to small and medium-sized businesses, delivering customized web design and SEO services professionally.

Sunday, May 31, 2009

What Really Is PageRank? How Important Is It?

PageRank is a link analysis algorithm used by the Google search engine that assigns a numerical weighting to each element of a hyperlinked set of documents, such as the World Wide Web, with the purpose of "measuring" its relative importance within the set. The algorithm can be applied to any collection of entities with reciprocal quotations and references.

PageRank is Google's way of deciding a page's importance. It matters because it is one of the factors that determines a page's ranking in the search results. It isn't the only factor that Google uses to rank pages, but it is an important one.



From Google PageRank Technology:

  • PageRank Technology: PageRank reflects our view of the importance of web pages by considering more than 500 million variables and 2 billion terms. Pages that we believe are important pages receive a higher PageRank and are more likely to appear at the top of the search results.

    PageRank also considers the importance of each page that casts a vote, as votes from some pages are considered to have greater value, thus giving the linked page greater value. We have always taken a pragmatic approach to help improve search quality and create useful products, and our technology uses the collective intelligence of the web to determine a page's importance.


Google PageRank is a value that Google assigns to the web pages in its index. It is based on the links Google has found pointing to a given page; links can come from other web sites as well as from other pages on the same site. The rank runs from 0 to 10, where 0 reflects few (or no) valuable incoming links and 10 reflects the strongest link profile.
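To make the "links as votes" idea concrete, here is a minimal sketch of the classic PageRank power iteration. This is a simplified model, not Google's production system, and the four-page link graph is made up for illustration:

```python
# Minimal PageRank sketch: each page's rank is split among the pages it
# links to, plus a small "damping" share given to every page.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

if __name__ == "__main__":
    # Hypothetical graph: C receives the most votes, so it ranks highest.
    graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
    ranks = pagerank(graph)
    for page in sorted(ranks, key=ranks.get, reverse=True):
        print(page, round(ranks[page], 3))
```

Note that the real 0-10 toolbar value is believed to be a logarithmic compression of scores like these, not the raw numbers.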





PR is just one factor that Google looks at when determining your site or blog's SERPs. (In a video on ranking, at 0:53 - 0:54, Matt Cutts mentions that PageRank is included as a ranking factor.)

Closely tied to PR in determining SERPs are backlinks: how many links from other sites (and even from within your own site) point to your domain. In fact, backlinks seem to have a more direct impact on traffic levels. Google treats a link from another page as a vote for quality, and PR is more difficult to abuse than other ranking factors. Gaining PR also takes time, so it reflects how established a web site is, which is a reasonable factor to consider when ranking web sites.

Many SEOs are obsessed with PR, since there is money in it: I have read news about people earning from buying and selling links based on PR, and about Google detecting it (Google Caught Selling High PageRank Links, Again & Again). They are making a business out of this little green bar. I've been doing SEO for two years, and many self-styled SEO experts on the forums keep saying that "PR is worthless". For me, as long as Google uses it, it still has value.



Saturday, May 23, 2009

Creating Better Links for Visitors to Click On

Links are the entry points of your site. If you do not have good links, people usually will not continue to browse your site.

Links are a vital part of a Web page. When most people think of web pages, they think of the content as the text written on the page. Yet eye-tracking studies have shown that while people viewing web pages are drawn to images, their eyes are also drawn to links. This is because, on most web pages, links are underlined and set in a different color from the surrounding text: they stand out.

Links Make Content Scannable

When your readers come to a web page, they are trying to learn something or be entertained. But most people do not read Web pages; they scan them. Good Web writing takes this into consideration and makes pages as scannable as possible. Because they stand out, links make a Web page more scannable. Just make sure that each link leads to something related to the linked text.

Your Home Page Must Have Links

Unless your site is only one page, your home page will link to other parts of your site. Most people will scan your home page rather than read it, so make the links as self-explanatory as possible. This is why web design and SEO go hand in hand when creating a site with many pages.

Here are some tips on placing links when designing a web page:

* Avoid a home page that is 100% links. This defeats the purpose of scannability, as the links no longer stand out.
* Give links pragmatic titles, not fancy ones. Do not make your readers guess where a link will go.
* Link to general pages that themselves contain many links.


Monday, May 18, 2009

The Basics of Web Designing

When using Graphics, think of the following tips:

# Think small: aim for 10-12KB per image. The number of broadband users is going up, but slow pages are still really annoying, and huge images are the primary cause of slow-loading web pages. Optimize your images.

# Always use graphics that fit the content. Just because you have an adorable photo you love does not mean it belongs on a website about web design. The main exception is "design" images: graphics that help make up the design of the page and are not intended to illustrate the content.

# Do not use images that blink, move, change, rotate, or flash, and do not scatter them all over your site. Many studies have shown that flashing graphics are very distracting and annoying; in one focus group, users actually physically covered up flashing graphics in order to read the content.


On Web Site Layout:

# Stick with standard layouts. Some pages use many frames on one page; one site used a layout where you had to scroll to the right to read everything. These layouts may be fun for designers to build, but they will drive readers nuts. The reason the 3-column layout is so popular on Web sites and in newspapers is that it works. You might think it's boring, but you will keep more readers if you stick with something simple that visitors can understand.

# Whitespace is more than the CSS property; it is a function of the layout. Designers should be aware of the whitespace on their pages and how it affects the way the content is viewed. Whitespace is just as important in a Web layout as it is in a paper layout.

# Use graphics as elements in your layouts. Graphics can be more than decoration when you use them as actual elements in your layouts. Wrapping text around an image is an obvious example, but any image on the site is a layout element and should be treated as such.


Regarding Fonts:

# Serif for headlines and sans-serif for text. If you come from print design, this may be exactly the opposite of what you were taught, but the Web is not print. Sans-serif fonts are much easier to read on computer monitors because screen resolution is not as high as in print; if you use serif fonts for normal text, the serifs can blur together on screen, making the text hard to read. Your printer-friendly pages should use the opposite arrangement (sans-serif for headlines and serif for body text).

# Limit the number of different fonts. One of the quickest ways to make your Web site look amateurish is to change the font over and over. Sure, it's possible to do, but limiting your page and site to two, or possibly three, standard font families is easier to read and looks more professional.

# Use standard font families. You could choose "Rockwood LT Standard" as the font for your page, but the chance that your readers also have that font installed is pretty low. Sticking with fonts like Verdana, Geneva, Arial, and Helvetica may seem boring, but your pages will look better, and the designs will render correctly in more browsers.
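A hedged sketch of how these font rules look in a stylesheet; the exact families are just the common examples mentioned above, and the browser falls back left to right until it finds an installed font:

```css
/* Fallback stacks: the browser uses the first installed family,
   ending with the generic keyword if none are available. */
body {
  font-family: Verdana, Geneva, Arial, Helvetica, sans-serif;
}
h1, h2, h3 {
  font-family: Georgia, "Times New Roman", serif;
}
```

Ending each stack with a generic keyword (`sans-serif` or `serif`) guarantees the browser always has something reasonable to render.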


When Advertising:

# Do not be greedy. If you have any control over the number of ads on your site, remember that your readers are not coming for the ads; they are coming for the content. If the ads overwhelm the page content, many readers will not stick around long enough to read your purple prose. Yes, it's important to make money from your Web site, but if your ads drive people away, you ultimately lose money.

# Treat ads as you would any other image. Keep them small, avoid blinking/flashing, and keep them relevant. Just because you can have an ad on your site, does not mean that you should. If the content is relevant to your readers, they are more likely to click on the ad.


Keep in mind that websites are designed for Readers:

# Test your pages in multiple browsers. Writing Web pages that work only on the most modern browser is both stupid and annoying. Unless you are writing a Web site for a corporate intranet or a kiosk where the browser version is completely fixed, you will have problems with people not being able to view your pages.

# The same is true for operating systems. You cannot assume that just because your page works in IE8.0 for Windows it will work in IE8.0 for Macintosh.

# Write content that they want. Unless you are writing a site purely for yourself, make sure that your content covers topics that your readers want to read.


Wednesday, May 13, 2009

Is Your Web Design Annoying?

Have you ever visited websites that are annoying? When it comes to web design, you should know what visitors hate most, unless you want them never to return. This article describes what you should avoid on your website.

Here are some pointers that might be useful to web designers and webmasters. Read through this top-5 list and decide for yourself whether you have been annoying your visitors.

* Popup windows - Even though popup windows are now blocked by many add-on tools, webmasters keep using them. The annoying part is that those anti-popup tools sometimes cause visitors to miss important information. Don't use popup windows; put your important messages in a prominent place on your website instead.

* Huge font size - If you are designing a website for people with a visual disability, you are doing the right thing; if not, you are shouting. A huge font is like shouting at your visitors, and people really don't like being shouted at.

* Small font size - This is also annoying, because readers need a magnifying tool to view it, especially if it is barely visible to the naked eye.

* Overlapping layers - Layers can be very useful, up to a point, but not when they are used to shove an annoying message in the visitor's face. Don't try to force your visitors to read your messages; try persuasion instead of brute force.

* Background music – Unless you are operating an online internet radio station or sell music CDs, why play a midi/wav file in the background continuously on every page?

Many webmasters, particularly new ones, are totally "in love" with their own ideas and tend to indulge themselves in their web design in one way or another.


Wednesday, May 6, 2009

SEO: Terms and Definition

WHAT IS SEO?

Search engine optimization (SEO) is a set of methodologies aimed at improving the visibility of a website in search engine listings; the term also refers to the industry of consultants who carry out optimization projects on behalf of client sites. It is the process of improving web pages so that they rank higher in the search engines for targeted keywords, with the ultimate goal of generating more revenue from the web site. There are many SEO techniques; in general, they can be categorized as on-page optimization, on-site optimization, and off-site optimization. In short, SEO means designing a website so that it ranks highly when someone searches for specific phrases related to the site.



A site map (or sitemap) is a web page that lists the pages on a web site, typically organized in hierarchical fashion. This helps visitors and search engine bots find pages on the site. Using sitemaps has many benefits, not only easier navigation and better visibility by search engines. Sitemaps offer the opportunity to inform search engines immediately about any changes on your site. Of course, you cannot expect that search engines will rush right away to index your changed pages but certainly the changes will be indexed faster, compared to when you don't have a sitemap.


Cloaking - The hiding of page content: serving one page to a search engine spider and a different page to a human visitor at the same URL. It is sometimes defended as a legitimate method of stopping page thieves from stealing optimized pages, but it is frowned upon by search engines and can result in penalties.

A robot is a program that automatically traverses the Web's hypertext structure by retrieving a document, and recursively retrieving all documents that are referenced. The robots exclusion standard, also known as the Robots Exclusion Protocol or robots.txt protocol is a convention to prevent cooperating web spiders and other web robots from accessing all or part of a website which is, otherwise, publicly viewable. Robots are often used by search engines to categorize and archive web sites, or by webmasters to proofread source code.


301 redirect is the most efficient and Search Engine Friendly method for webpage redirection. It's not that hard to implement and it should preserve your search engine rankings for that particular page. If you have to change file names or move pages around, it's the safest option. The code "301" is interpreted as "moved permanently". This is a special redirect used by most SEO Experts.
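A minimal sketch of how a 301 is typically declared in an .htaccess file, assuming an Apache server with mod_alias enabled; the file paths and domain here are hypothetical:

```apache
# Hypothetical example: permanently move one page to a new address.
# "301" tells browsers and search engines the move is permanent.
Redirect 301 /old-page.html http://www.example.com/new-page.html
```

Other servers (IIS, nginx) have their own syntax for the same status code; what matters to search engines is the 301 in the response header, not how it is configured.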

302 redirect is described in RFC 2616: "The requested resource resides temporarily under a different URI. Since the redirection might be altered on occasion, the client SHOULD continue to use the Request-URI for future requests." When a URL returns a 302 redirect, the owner of the link is asking users to continue using the original address, since the redirect could be modified at some later time.

OBL- outbound-link (OBL) is a link from your website to another website.

A reciprocal link is a mutual link between two objects, commonly between two websites in order to ensure mutual traffic.

One way link is a term used among webmasters for link building methods. It is a Hyperlink that points to a website without any reciprocal link; thus the link goes "one way" in direction.

SERP- Abbreviation for search engine results page, the Web page that a search engine returns with the results of its search.

IBL - Inbound Link - A link from another page that links to your page. A link from a site outside of your site.

A grey hat is a skilled hacker who sometimes acts legally and in good will, and sometimes not: a hybrid between white hat and black hat hackers. Grey hats hack for no personal gain and without malicious intent, but they may still commit crimes in the process.

Doorway pages are web pages created for spamdexing, that is, for spamming the index of a search engine by inserting results for particular phrases, usually in order to send visitors on to a different page. Doorway pages are easy to identify in that they have been designed primarily for search engines, not for human beings.

(ex. http://www.tamba.co.uk/web-design-and-development/web-design-uk.htm)

Mirror Site - is a web site that is a replica of an already existing site, used to reduce network traffic (hits on a server) or improve the availability of the original site. Mirror sites are useful when the original site generates too much traffic for a single server to support. Mirror sites also increase the speed with which files or Web sites can be accessed: users can download files more quickly from a server that is geographically closer to them.


A link farm posts large numbers of unrelated links on its site. Link farm sites, and the links they provide, are pretty much useless and not worth linking to. (Also known as FFA - Free For All.)

Landing Page - also called an entry page; the first page a visitor sees on arriving at a site, which for many visitors is the home page.

Spam is flooding the Internet with many copies of the same message, in an attempt to force the message on people who would not otherwise choose to receive it. Most spam is commercial advertising, often for dubious products, get-rich-quick schemes, or quasi-legal services. Spam costs the sender very little to send -- most of the costs are paid for by the recipient or the carriers rather than by the sender.

PageRank is a function of Google that measures the quality of a website on a scale of 0 to 10. The theory is that high-quality sites receive a higher PageRank; PageRank is a "vote" cast by all the other pages on the internet, essentially a score out of ten for the "value" of your site compared with other websites. It is based on two primary factors: the number of links pointing to your website and the value of those links. Real PR is updated all the time, but toolbar PR generally updates only every couple of months (some say every three).

Link popularity - A measure of the quantity and quality of sites that link to your site.

Keyword prominence - a measure of how close a keyword appears to the beginning of an area. The closer to the beginning your targeted keyword appears, the higher its prominence, and higher prominence is generally favorable. Prominence applies to words in the title, the page body, META tags, heading tags, and alt attributes. In general, a keyword that appears closer to the top of the page or area is treated as more relevant, though sometimes it helps to have a keyword in the middle of an area, or even toward the end.

Prominence plays a critical role particularly in directory based engines such as Yahoo. Often having the keyword slightly more towards the beginning of the site description or site title will make a large difference in your ranking.

Examples of prominence:

If a keyword appears at the beginning of an area, its prominence will be 100%.

If a keyword appears in the middle of an area, its prominence will be 50%.

If the keyword appears at the beginning of the area, then another repetition appears at the end of the area, the prominence would be 50%.

If the keyword appears at the end of the area, prominence would be 0%.
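The percentage rule in the examples above can be sketched in a few lines. This is only an illustration of that rule (100% at the start, 0% at the end, averaged over occurrences); real SEO tools vary in how they tokenize and weight text:

```python
# Keyword prominence sketch: score each occurrence by how close it is
# to the start of the area, then average the scores.

def keyword_prominence(area_text, keyword):
    words = area_text.lower().split()
    positions = [i for i, w in enumerate(words) if w == keyword.lower()]
    if not positions or len(words) < 2:
        return 0.0
    n = len(words)
    # First word scores 100%, last word scores 0%, linear in between.
    scores = [(n - 1 - i) / (n - 1) * 100 for i in positions]
    return sum(scores) / len(scores)
```

For example, a keyword opening a five-word title scores 100%, while the same keyword repeated at the start and the end averages out to 50%, matching the cases listed above.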


Keyword proximity is the measure of closeness between two keywords.


Keyword density - the percentage of occurrences of a keyword or phrase relative to the total number of words in the page body.
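One straightforward way to compute that percentage (a sketch; tools differ on whether multi-word phrases count each word or once per match, and this version counts every word of each match):

```python
# Keyword density sketch: words belonging to phrase matches, divided by
# total words in the body, as a percentage.

def keyword_density(body_text, phrase):
    words = body_text.lower().split()
    phrase_words = phrase.lower().split()
    total = len(words)
    if total == 0 or not phrase_words:
        return 0.0
    span = len(phrase_words)
    hits = sum(
        1
        for i in range(total - span + 1)
        if words[i:i + span] == phrase_words
    )
    return hits * span / total * 100
```

So a seven-word body containing "web design" twice has a density of 4/7, roughly 57%; most SEO advice of this era suggested staying far below such levels to avoid looking like keyword stuffing.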

An example of 3-way linking is when a person owns two Web sites, and uses one of the Web sites to offer link exchanges, in an attempt to build up the popularity of the other. In doing so, he or she adds a link to your site on the link exchange Web site, and asks in return that you link to the main site being promoted.

KEI stands for Keyword Effectiveness Index, which measures how effective a keyword is for your web site. It compares the number of searches for a keyword (popularity) with the number of search results (competitiveness) to identify which keywords are most effective for your campaign. The higher the KEI value, the more popular the keyword and the less competition it has, so the better your chance of reaching a top position.
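One commonly cited formulation (an assumption here, since keyword tools define KEI slightly differently) squares the popularity and divides by the number of competing results:

```python
# KEI sketch: popularity squared over competitiveness, so that popular
# keywords with little competition score highest.

def kei(monthly_searches, competing_results):
    if competing_results == 0:
        return float("inf")  # no competition at all
    return monthly_searches ** 2 / competing_results
```

Under this formula, a keyword searched 1,000 times a month against 500,000 competing pages scores 2.0, while the same popularity against only 50,000 pages scores 20.0 and would be the better target.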


.htaccess is the default name for a file that is used to indicate who can or cannot access the contents of a specific file directory from the Internet or an intranet. The .htaccess file is a configuration file that resides in a directory and indicates which users or groups of users can be allowed access to the files contained in that directory.

LSA - Latent Semantic Analysis


LSI (latent semantic indexing) is a methodology, based on statistical probability and correlation, that helps deduce the semantic distance between words. It is a complex methodology, but it can readily be applied to understand the relationship between certain words in a paragraph or document, and it is used when indexing pages in a search engine's database.

Latent semantic indexing adds an important step to the document indexing process. In addition to recording which keywords a document contains, the method examines the document collection as a whole to see which other documents contain some of those same words. LSI considers documents that have many words in common to be semantically close, and documents with few words in common to be semantically distant. This simple method correlates surprisingly well with how a human being, looking at content, might classify a document collection. Although the LSI algorithm doesn't understand anything about what the words mean, the patterns it notices can make it seem astonishingly intelligent.

Some sites are now constructed specifically around LSA/LSI with the intention of taking the top rankings away from ordinary web sites, and these new sites have proven ruthlessly efficient at bumping "ordinary" sites out of the top slots for a very wide range of search phrases.
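Real LSI works by singular value decomposition over a term-document matrix, which is beyond a short sketch; this toy word-overlap score only illustrates the intuition above, that documents sharing many words are "semantically close":

```python
# Toy illustration of the "shared words = semantically close" intuition.
# This is NOT LSI itself, just a Jaccard overlap between word sets.

def word_overlap(doc_a, doc_b):
    a = set(doc_a.lower().split())
    b = set(doc_b.lower().split())
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)  # 0.0 = disjoint, 1.0 = identical
```

Where this toy measure sees only literal shared words, genuine LSI can also relate documents that share no words directly but co-occur with the same vocabulary elsewhere in the collection.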


Who is Matt Cutts?


From Wikipedia: Matt Cutts works in the search quality group at Google, specializing in search engine optimization issues. He is well known in the SEO community for enforcing the Google Webmaster Guidelines and cracking down on link spam. He is one of the co-inventors listed on one of the best-known patent filings from Google involving search engines and web spam, which was the first to publicly propose using historical data to identify link spam. His blog is http://www.dullest.com/blog/.


Spider - a program that visits Web sites and reads their pages and other information in order to create entries for a search engine index. The major search engines on the Web all have such a program, which is also known as a "crawler" or a "bot." Spiders are typically programmed to visit sites that have been submitted by their owners as new or updated. A program that automatically fetches Web pages. Spiders are used to feed pages to search engines. It's called a spider because it crawls over the Web. Another term for these programs is webcrawler.

Meta Tags - Tags included in an HTML document that search engines use to categorize a web site; they include information such as keywords and a description. Meta elements provide information about a given webpage, most often to help search engines categorize it correctly. They are inserted into the head section of the HTML code and are not visible to a user looking at the site.
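A sketch of what these tags typically look like in the document head; the title and content values here are made-up examples:

```html
<!-- Hypothetical example values. Meta tags live in the <head>
     and are not rendered as part of the visible page. -->
<head>
  <title>Affordable Web Design Services</title>
  <meta name="description" content="Custom web design and SEO services for small businesses.">
  <meta name="keywords" content="web design, seo, e-commerce">
</head>
```

The description often supplies the snippet shown under a result in the SERPs, while the keywords tag has long been discounted by major engines because it was so widely abused.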

Anchor text refers to the visible text for a hyperlink. For example:

<a href="http://teamcreatives.com/">Website Design and Development Outsourcing Company</a> (the visible text is the anchor text)

Back Link - Any link on another page that points to the subject page. Also called inbound links or IBLs.

Bot - Abbreviation for robot (also called a spider). It refers to software programs that scan the web. Bots vary in purpose from indexing web pages for search engines to harvesting e-mail addresses for spammers.

Cloaking - Cloaking describes the technique of serving a different page to a search engine spider than what a human visitor sees. This technique is abused by spammers for keyword stuffing. Cloaking is a violation of the Terms Of Service of most search engines and could be grounds for banning.

Conversion - Conversion refers to site traffic that follows through on the goal of the site (such as buying a product on-line, filling out a contact form, registering for a newsletter, etc.). Webmasters measure conversion to judge the effectiveness (and ROI) of PPC and other advertising campaigns. Effective conversion tracking requires the use of some scripting/cookies to track visitors actions within a website. Log file analysis is not sufficient for this purpose.

CPC - Abbreviation for Cost Per Click. It is the base unit of cost for a PPC campaign.

CTA - Abbreviation for Content Targeted Ad(vertising). It refers to the placement of relevant PPC ads on content pages for non-search engine websites.

CTR - Abbreviation for Click Through Rate. It is a ratio of clicks per impressions in a PPC campaign.

Gateway Page - Also called a doorway page. A gateway page exists solely for the purpose of driving traffic to another page. They are usually designed and optimized to target one specific keyphrase. Gateway pages rarely are written for human visitors. They are written for search engines to achieve high rankings and hopefully drive traffic to the main site. Using gateway pages is a violation of the Terms Of Service of most search engines and could be grounds for banning.

Keyword/Keyphrase - Keywords are words used in search engine queries; keyphrases are multi-word phrases used in search engine queries. SEO is the process of optimizing web pages for keywords and keyphrases so that they rank highly in the results returned.

Keyword stuffing refers to the practice of adding superfluous keywords to a web page. The words are added for the 'benefit' of search engines and not human visitors. The words may or may not be visible to human visitors. While not necessarily a violation of search engine Terms of Service, at least when the words are visible to humans, it detracts from the impact of a page (it looks like spam). It is also possible that search engines may discount the importance of large blocks of text that do not conform to grammatical structures (ie. lists of disconnected keywords). There is no valid reason for engaging in this practice.

PFI - Abbreviation for Pay For Inclusion. Many search engines offer a PFI program to assure frequent spidering / indexing of a site (or page). PFI does not guarantee that a site will be ranked highly (or at all) for a given search term. It just offers webmasters the opportunity to quickly incorporate changes to a site into a search engine's index. This can be useful for experimenting with tweaking a site and judging the resultant effects on the rankings.

Portal - Designation for websites that are either authoritative hubs for a given subject or popular content driven sites (like Yahoo) that people use as their homepage. Most portals offer significant content and offer advertising opportunities for relevant sites.

PPC - Abbreviation for Pay Per Click. An advertising model where advertisers pay only for the traffic generated by their ads.

Google Adwords - PPC program where webmasters can create their own ads and choose keywords.

Robots.txt - Robots.txt is a file which well behaved spiders read to determine which parts of a website they may visit.
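A minimal robots.txt sketch; the /private/ path is hypothetical, and the file must sit at the root of the site (e.g. example.com/robots.txt) to be honored:

```
# Applies to every spider (*): allow the whole site
# except the /private/ directory.
User-agent: *
Disallow: /private/
```

Remember this is a convention, not an access control: well-behaved spiders obey it, but nothing forces a rogue bot to.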

SEM - Abbreviation for Search Engine Marketing. SEM encompasses SEO and search engine paid advertising options (banners, PPC, etc.)

Spider Trap - refers either to a continuous loop in which a spider keeps requesting pages that the server generates without end, or to an intentional scheme designed to identify (and "ban") spiders that do not respect robots.txt.

Stop Word - Stop words are words that are ignored by search engines when indexing web pages and processing search queries. Common words such as "the".

CPM - Cost per thousand impressions (page views). It is increasingly important: sites with thousands of pages and many repeat visitors will benefit from CPM rather than CPC if their click-through rate is low.

Social bookmarking is an activity performed over a computer network that allows users to save and categorize (see folksonomy) a personal collection of bookmarks and share them with others. Users may also take bookmarks saved by others and add them to their own collection, as well as to subscribe to the lists of others. A personal knowledge management tool.

Sticky Thread is a thread that always shows at the top of the threads listed in a forum.

A CAPTCHA (Completely Automated Public Turing Test to tell Computers and Humans Apart) is a challenge-response system test designed to differentiate humans from automated programs. A CAPTCHA differentiates between human and bot by setting some task that is easy for most humans to perform but is more difficult and time-consuming for current bots to complete. CAPTCHAs are often used to stop bots and other automated programs from using blogs (see splog) to affect search engine rankings, signing up for e-mail accounts to send out spam or take part in on-line polls.

Link bait is content on your site to which other sites link because they want to, not because you ask them to. Link bait is any content or feature within a website that somehow baits viewers to place links to it from other websites. Attempts to create link bait are frequently employed in the overall task of search engine optimization. Matt Cutts defines link bait as anything "... interesting enough to catch people's attention." Link bait can be an extremely powerful form of marketing as it is viral in nature.

Spamdexing describes efforts to spam a search engine's index. Spamdexing is a violation of the Terms of Service of most search engines and can be grounds for banning.

Web design is the craft of designing presentations of content (usually hypertext or hypermedia) delivered to an end-user through the World Wide Web, by way of a Web browser or other Web-enabled software such as Internet television clients, micro-blogging clients, and RSS readers. A company gets a website designed to reach the masses through search engines, so a site should be designed in line with the guidelines laid down by the search engines. This helps produce a website that is both more interesting and more search engine friendly.


Link velocity refers to the speed at which new links to a webpage are formed, and the term offers some useful insight. Historically, great bursts of new links to a specific page have been considered a red flag: the quickest way to identify a spammer trying to manipulate the results by creating the appearance of user trust. This led to Google's famous assaults on link farms and paid link directories.










Social Bookmarking Sites | Benefits of Social Bookmarking

Social bookmarking is a way for Internet users to store, search, manage, and organize bookmarks of web pages on the Internet with the help of metadata, typically in the form of tags that collectively and/or collaboratively become a folksonomy. Folksonomy is also called social tagging: "the method by which many users add metadata in the form of keywords to shared content".

In a social bookmarking system, users save links to web pages that they want to remember and/or share. These bookmarks are usually public, but they can also be saved privately, shared only with specified people or groups, shared only inside certain networks, or kept in some other mixture of public and private. The people with access can usually view the bookmarks chronologically, by category or tag, or via a search engine.


BENEFITS OF SOCIAL BOOKMARKING:

1. Quick Indexing: The fastest way to get any web page or site visited and indexed by the search engines.

2. Traffic Generation: There is more to bookmarking your pages than just getting indexed; you are almost guaranteed some traffic from these sites as soon as you submit to them.

3. Personal Branding: Most social bookmarking sites, like StumbleUpon, also give you the opportunity to develop your own pages on their site.

4. Quality Backlinks: Building a personal page on many of the social bookmarking sites earns backlinks, and those pages are frequented by the search engine robots, which index any new content you post.

Popular social bookmarking sites that help with promotion and drive more traffic:

Digg.com
del.icio.us
reddit
Furl
StumbleUpon
Propeller
Zimbio
Jumptags
Yahoo! Buzz
Technorati
Mixx
Spicy Page

More social bookmarking lists:


125 Social Bookmarking Sites
Top 20 Social Bookmarking Sites
81 social bookmarking websites


* Note: Just don't spam! :D
