What is SEO?
SEO stands for Search Engine Optimization - the practice of improving how high your site appears in the search results of Google, Bing, Yahoo, etc. Getting to the top of the list earns you an overwhelming share of the click-throughs, while showing up on page two means you have lost 99% of potential clients. [statistic 31.]
As with many of the impactful technologies nowadays, SEO opened a new business niche and created a market for itself. The product is people's attention.
Every person on Google is a potential customer. They have an idea of what they want, they type it in and end up on your site. This dream scenario is the goal of SEO and the reason people pay experts. The results you get will either let your business flourish or end it.
In this article, we explore practical steps developers can take to either boost the SEO of an existing site or lay the proper foundations for a brand-new business. Either way, you can surely benefit from SEO knowledge as it is marketable and in demand.
What can developers do?
It turns out that you, the developer, have a huge impact on SEO. There are many options before you, so allow me to narrow them down and list a dozen with the biggest impact.
Make your site crawlable
Search engines visit your site via a crawling bot. You can think of it as an unregistered user that will only be able to see what a guest user can see. Any information meant for paying users will be neither accessible nor taken into account when the bot scores your site. Be thoughtful and provide relevant content free of charge so Google and Bing actually have something to show for you.
Please, get an SSL certificate. Not doing so is the quickest way to sabotage your website's discoverability as all modern browsers will prompt users not to trust your site unless the connection is secure. The reputation cost you pay will outweigh any and all optimizations you have done so far. This is a must.
Crawling bots don't operate the way we do. They cannot look at pictures and decide whether they are helpful. They neither watch nor listen to videos. Bots rather read the source code of the site and do their scoring based on its structure and semantics. HTML is in their blood and hopefully in yours too.
The page title is of primary importance [statistic 18.]. It is set via the <title> tag and provides the name of your page and the title shown in search results on Google.
You must make sure that all resources in your platform have an identifier that can be used as a page title.
- keep titles 50-60 characters long
- keep them simple and unique, and try to use keywords
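Following those guidelines, a title for a hypothetical recipe page might look like this (site and page names are invented for illustration):

```html
<head>
  <!-- ~53 characters: concise, unique, and keyword-rich -->
  <title>How to Roast Potatoes in 30 Minutes | Example Kitchen</title>
</head>
```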
Descriptions are texts that appear below the titles of search results. They provide additional information and can boost your position and click rates as keywords are bolded and indicate that your site is relevant.
You can hardcode descriptions for index pages; when it comes to show pages, however, you should lay the groundwork and add a separate database column. Talk it through with your client and leave an input field that admins populate upon document creation.
From a developer's standpoint, providing the meta description is easy. Add <meta name="description" content="..."> to the head element.
- aim for a sentence or two (~150 characters)
- make it descriptive and in sync with your content
- use keywords and avoid duplication of meta descriptions
- differentiate by adding an emoji - the color will set you apart and attract clicks
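Putting the guidelines above together, a hypothetical meta description could look like this (the content is invented for illustration):

```html
<head>
  <!-- ~130 characters, keyword-rich, unique per page -->
  <meta
    name="description"
    content="🥔 Crispy roasted potatoes in 30 minutes. A simple step-by-step recipe with only five ingredients - perfect for weeknight dinners."
  />
</head>
```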
Heading tags have a built-in hierarchy. The top-level heading is <h1>, followed by <h2>, and so on down to the lowest, <h6>. The proper usage of headings outlines the structure of your text and helps readers scan it quickly. When a heading catches the attention of a visitor, they are likely to spend more time on your site and improve its rating, as Google takes the duration of page visits into account.
Set up your project for success by designing your DB schemas in a manner that allows and guides admins into providing titles, subtitles, lists of questions and answers, etc. that can be transformed into a hierarchy of headings. Such information will prove useful for other SEO practices too.
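A hypothetical article page might translate such a schema into a heading hierarchy like this (the indentation is only for readability; headings are not actually nested in HTML):

```html
<h1>How to Roast Potatoes</h1>
  <h2>Ingredients</h2>
  <h2>Step-by-step instructions</h2>
    <h3>Preparing the potatoes</h3>
    <h3>Roasting</h3>
  <h2>Frequently asked questions</h2>
```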
Bots treat the HTML at the top of the source code as the most important and lower the significance of tags found later. This means that any headers at the top of the site could greatly influence the scoring of your pages even though they are duplicate content. That is, unless you surround them with <nav> tags. When you do that, you indicate to crawlers that this content is for navigational purposes and serves to improve the usability of the site.
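For example, a typical site header wrapped in a <nav> tag might look like this (the links are hypothetical):

```html
<header>
  <nav>
    <!-- marked as navigation: crawlers treat these links as site chrome, not primary content -->
    <a href="/">Home</a>
    <a href="/blog">Blog</a>
    <a href="/contact">Contact</a>
  </nav>
</header>
```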
Structured data is a JSON document that you put inside a <script type="application/ld+json"> tag. It speaks directly to bots and explains to them what the page they are currently crawling is all about. Structured data also serves as a suggestion for how search engines should display the data.
Try searching for "how to roast a potato" on Google.
Now try "largest volcano".
Sometimes Google crawls credible sources that have well-written structured data, understands what the data represents, and uses it to show default answers for specific searches. As long as the site follows the official markup listed at Schema.org, search engines are able to differentiate facts, recipes, people, places, organizations, etc.
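As a minimal sketch, here is what structured data using the Organization type from Schema.org could look like (all values are invented for illustration):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Kitchen",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png"
}
</script>
```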
OpenGraph was first introduced by Facebook and provided a way to share links and get a nice preview. In time, tech companies adopted the og: meta properties and OpenGraph became the norm. Here is what you get for a few lines of code:
You can check all available metadata properties, and how and when to use them, here. The fundamental ones that you should know are title, description, type, and image. They are added to the head element like this:
<meta property="og:title" content="Lexis Solutions - Your Ideas, Our Solutions."/>
<meta property="og:description" content="Lexis Solutions is a software agency..."/>
<meta property="og:type" content="website"/>
<meta property="og:image" content="..."/>
- make og:title and og:description the same as the page title and the meta description
- music and video have dedicated meta data properties
- Twitter has its own standard
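Twitter's tags closely mirror the OpenGraph ones; a minimal sketch, reusing the values from the example above:

```html
<meta name="twitter:card" content="summary_large_image" />
<meta name="twitter:title" content="Lexis Solutions - Your Ideas, Our Solutions." />
<meta name="twitter:description" content="Lexis Solutions is a software agency..." />
<meta name="twitter:image" content="..." />
```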
Sometimes you have duplicate content on multiple pages. Google's crawling bots will realize that and show only one page in the search results. If you want to pick which one that should be, you can specify a canonical URL. It's as simple as adding a new meta tag, and it allows multiple pages to collect a joint reputation. The code is added to the head element like this:
<link rel="canonical" href="https://lexis.solutions/" />
Let's talk about calendars. The thing with calendars is that the new ones should dethrone the old ones, SEO-wise, immediately. This is easily achieved by taking advantage of canonical URLs, and we have tested it! Generate the URL for the newest calendar and have all calendars point to it "canonically". This way you seamlessly transfer the reputation and appear at the top of Google.
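To sketch the calendar trick with a hypothetical URL scheme: every older calendar page declares the newest one as canonical, so the accumulated reputation follows it.

```html
<!-- placed on /calendar-2021 and /calendar-2022 alike (hypothetical URLs): -->
<link rel="canonical" href="https://www.example.com/calendar-2023" />
```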
robots meta tag
You can tell crawlers whether to index your pages, follow links, show images in search results, or ignore your page completely. In fact, they understand a broader range of commands that you can read about here.
In my estimate, the ones you need to know are noindex, nofollow, and none.
noindex instructs bots that the current page should not be shown in search results. nofollow indicates that you do not trust and do not wish to be associated with links on the current page. none combines both noindex and nofollow.
Your preferred directive is added to the head element like this:
<meta name="robots" content="...">
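For instance, to keep a hypothetical page out of search results while also distancing yourself from the links it contains, you could combine the two directives, or use the shorthand:

```html
<meta name="robots" content="noindex, nofollow" />
<!-- equivalent shorthand: -->
<meta name="robots" content="none" />
```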
Bots don't analyze images and rely fully on the alt attribute to make sense of pictures. All img tags in your project should include alternative information like this:
<img src="photo.jpg" alt="description">
Provide a relevant description as the alt attribute. It does count and will increase your SEO rating. Bear in mind that setting false or misleading descriptions may get you penalized.
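As an illustration, a descriptive (and invented) alt text beats a generic one:

```html
<!-- vague - tells the bot nothing: -->
<img src="photo.jpg" alt="photo" />
<!-- descriptive and honest: -->
<img src="photo.jpg" alt="Golden roasted potato wedges on a baking tray" />
```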
By now you should have a basic understanding of the importance of HTML structure and semantics. This knowledge alone should get you on the right path.
Your next steps should aim at optimizing load speeds, providing sitemaps and resource schemas, and utilizing some of the tools available out there. Go read and learn more on the topic in the follow-up article on SEO!