Basic SEO

I’ve been doing some SEO work at my job. I tend to think of SEO as sleazy territory: “marketing firms” do SEO by spamming your URL across lots of sites to get more links, and if you mention SEO on Twitter, you’ll pick up a swarm of spammy followers. It’s kind of like used car salesmen, only worse. But SEO isn’t necessarily a dirty rotten practice. A decent part of it is just following established best practices so that search engines can surface your content when it’s relevant; the sleazy part comes in when you game the system. I’ve been applying some of what I’ve learned at work to my own sites, and early indications are that it’s working in driving traffic. Here are some of the ‘no duh’ things I’ve learned:

* Use the meta tag’s description parameter. It’s what Google, at least, uses to display in the search results.
* Use the meta tag’s keywords parameter. There’s lots of speculation about exactly how it’s weighted, but the short version that most everybody agrees with is that if your keywords match a search term, you come up much higher than if your text merely happens to be relevant. Most people suggest listing the keywords in priority order.
* URLs should contain keywords, too. If you’re gunning for the search phrase “basic seo directions,” a URL of /basic-seo-directions/ helps a lot more than /posts.php?id=524.
* Use nofollow on links, especially user-submitted links. The prevailing theory is that each page has a finite amount of “link juice,” and if you link to a billion sites, it’s heavily diluted. If you make most of them nofollow, you can sculpt things the way you want.
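Taken together, the points above amount to a few lines of markup. Here’s a minimal sketch; the description text, keyword values, and link URL are made up for illustration:

```html
<!-- In <head>: the description is what Google may show as the result snippet -->
<meta name="description" content="Step-by-step basic SEO directions for small sites.">
<!-- Keywords listed in priority order; how engines weight them varies -->
<meta name="keywords" content="basic seo directions, seo basics, meta tags">

<!-- In the body: a user-submitted link marked nofollow -->
<a href="http://example.com/submitted-site/" rel="nofollow">a reader’s site</a>
```

And the page itself would be served at a keyword-rich path like /basic-seo-directions/ rather than /posts.php?id=524.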
More importantly, perhaps, nofollow helps keep spammers and sleazeballs from ranking their sites highly.

I’ve started to see some things in a new light. For example, I host a site for a local music teacher, and I’m trying to help drive relevant traffic. I’m finding that the keywords are hard to manage, because what people search for is often an intersection of [piano, flute, tuba, keyboard, music…], [lessons, classes, teacher], and [local town names]. Cramming them all in as random keywords doesn’t help, and it’s not practical to generate every intersection as a keyword, either. Currently the site has just a few pages, but I’m thinking it makes sense to branch out into about a dozen, so there might be a “Piano Lessons” page, a “Music Lessons” page, and a “Tuba Lessons” page, making it much easier to split up the keywords. It’s fairly inexpensive to crank out. The trick is staying non-sleazy and making sure there’s actually relevant content on each one, but they’d essentially each be a landing page.

By the way, I’ve installed an SEO plugin for WordPress here, so now, when composing or editing blog posts, we’re able to set some of this metadata. As an experiment, I tried setting tags matching keywords used in searches, and traffic seems to be growing a bit. Feel free to enable it and give it a whirl.
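A cheap way to see how many landing pages the intersections imply is to enumerate them. A sketch, with hypothetical instrument, offering, and town lists standing in for the real ones:

```python
from itertools import product

instruments = ["piano", "flute", "tuba", "keyboard", "music"]
offerings = ["lessons", "classes", "teacher"]
towns = ["springfield", "shelbyville"]  # hypothetical local towns

# One candidate landing-page slug per (instrument, offering) pair;
# town names can live in the page copy rather than as separate pages.
pages = [f"{i}-{o}" for i, o in product(instruments, offerings)]

print(len(pages))   # 5 instruments x 3 offerings = 15 slugs
print(pages[0])     # piano-lessons
```

Even this small cross product yields 15 pages, which is why generating a page per full (instrument, offering, town) triple gets impractical fast, and a dozen well-chosen pages is the saner middle ground.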

