Useful links: CSS selectors, OS X shortcuts, Postel’s law, Bob’s Birthday

Understanding CSS Selectors at Script Junkie. Great explanation of attribute selectors, including how to style with attribute selectors when using ARIA roles and forms. The article also explains how to use more than one attribute selector to create styles.
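For anyone who hasn’t tried attribute selectors with ARIA roles yet, here’s a minimal sketch of the kind of styling the article describes. The role values, attributes, and colors are my own illustration, not taken from the article:

```html
<style>
  /* Match an exact attribute value: style the navigation landmark */
  [role="navigation"] { background: #eef; }

  /* Match any element that simply has the attribute */
  input[required] { border: 2px solid #c00; }

  /* Combine more than one attribute selector on the same element */
  input[type="text"][aria-invalid="true"] { background: #fdd; }
</style>

<nav role="navigation">
  <a href="/">Home</a>
</nav>
<form>
  <label for="email">Email</label>
  <input type="text" id="email" name="email" required aria-invalid="true">
</form>
```

Because the selectors key off the ARIA and form attributes themselves, the styling stays in sync with the markup’s semantics instead of duplicating them in class names.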

Want to look like a real computer whiz on your Mac? Study A selection of Mac OS X keyboard shortcuts at 456 Berea Street for some lesser-known shortcuts that will speed up your day.

Adactio was talking about Going Postel today. He said,

As long as we use progressive enhancement, the front-end stack of HTML, CSS, and JavaScript is remarkably resilient. Remove JavaScript and some behavioural enhancements will no longer function, but everything will still be addressable and accessible. Remove CSS and your lovely visual design will evaporate, but your content will still be addressable and accessible.

I had to go look up Postel’s Law to see what he was referring to. Just in case you’ve never heard of it either, here’s how Wikipedia describes it:

In computing, the robustness principle is a general design guideline for software:

Be conservative in what you send; be liberal in what you accept.

The principle is also known as Postel’s Law, after Internet pioneer Jon Postel, who wrote in an early specification of the Transmission Control Protocol that:

TCP implementations should follow a general principle of robustness: be conservative in what you do, be liberal in what you accept from others.

Today is Bob Marley’s birthday. And I cannot resist an occasional foray into the off-topic land of iconic music. The wonderful organization Playing for Change has given us this in celebration of the day.

Useful links: Screen readers, Ada Initiative, HTML versioning

Videos of screen readers using ARIA at zomigi.com is enlightening. Watch.

The Ada Initiative is dedicated to increasing participation of women in open technology and culture, which includes open source software, Wikipedia and other open data, and open social media.

HTML as a Living Standard – For and Against is must reading at HTML5 Doctor. Taking on the two sides of the debate are Bruce Lawson and John Foliot.

Useful links: @acarvin, programmatic, HTML5 accessibility

If you’ve been watching events unfolding in Egypt on Twitter you are aware of what Andy Carvin from NPR has been doing in terms of collecting and broadcasting tweets. Here’s a good interview with him from My Heart’s in Accra: Interview with Andy Carvin on curating Twitter to watch Tunisia, Egypt.

“The phrase ‘programmatically determined’ features prominently in six of the 61 WCAG 2.0 Success Criteria.” In Programmatically Determined at Accessible Culture, you can find out what it’s all about. Here’s a key bit.

When content is properly marked up in HTML, its semantic structure and relationships are in the markup itself. That is, they can be programmatically determined. Because this information is in the code, as it were, supporting technologies can programmatically retrieve it and present it to users in different ways. The information can be transformed…into different sensory formats (e.g., visual, auditory) or styles of presentation needed by individual users. This is a key aspect of accessible web content and a core concept in WCAG 2.0.

Such information can then be passed along by the browser to whatever other device or software is able to make use of it. Screen readers, voice recognition software, alternative input devices, etc., can tell what each bit of content is and allow users to interact with them accordingly.
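As a concrete illustration of what that means (my own example, not from the article), compare markup whose structure is only visual with markup whose structure a screen reader can programmatically determine:

```html
<!-- Presentational only: nothing in the code says this is a heading or a list,
     so assistive technology cannot determine the structure -->
<div class="big-bold">Ingredients</div>
<div>Flour<br>Sugar</div>

<!-- Semantic: the heading level and the list structure are in the markup itself,
     so they can be programmatically determined and announced to the user -->
<h2>Ingredients</h2>
<ul>
  <li>Flour</li>
  <li>Sugar</li>
</ul>
```

Both versions can be styled to look identical, but only the second one carries its structure in the code where a screen reader can retrieve it.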

Read the comments, too. They are valuable.

HTML5 Accessibility Challenges by Steve Faulkner is a quick summary of some of the issues.

Free eBook on fieldset CSS

The posts here on fieldset CSS are so popular, I decided to put them all together in one PDF file and offer it as a free download.

It’s a PDF file, so it isn’t perfect, but it does give you the information you may be looking for all in one place. If you have any problems with the PDF format, the individual posts are all still available here on the blog.

Download Web Teacher on Fieldset CSS.
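For a taste of the kind of fieldset styling those posts cover, here’s a minimal sketch. The specific colors, widths, and field names are my own illustration, not taken from the ebook:

```html
<style>
  /* Replace the default fieldset border and give the legend some presence */
  fieldset {
    border: 1px solid #999;
    border-radius: 6px;
    padding: 1em;
  }
  legend {
    font-weight: bold;
    padding: 0 0.5em;
  }
</style>

<fieldset>
  <legend>Contact details</legend>
  <label for="email">Email</label>
  <input type="email" id="email" name="email">
</fieldset>
```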

Here is the applicable Creative Commons license info.

You don’t get it until you get it

One of my favorite responses when asked about Twitter is, “You don’t get it until you get it.”

What I mean is that hearing about Twitter makes people scoff and dismiss it. Seeing the public timeline makes people say things about wasting time and having better things to do. I was one of the scoffers at first.

Then I got it.

The reasons I got it are years in the past. I suspect that some new people are getting it in the last few days because of events in Egypt. It’s hard to ignore how significant even 140 characters of open communication can be when freedom and self-determination are at stake.

Those of you who are fans of Grey’s Anatomy saw the Chief get it last night. Dr. Bailey was tweeting from the operating room, and the Chief didn’t like it.

Until he did.

It was a classic conversion to getting Twitter.

If you didn’t see it last night, watch it free on ABC. If you ever need an example of why people think Twitter is important, this episode of Grey’s Anatomy is a perfect teaching tool. There’s a short clip about Dr. Bailey tweeting, but it doesn’t go on long enough for you to see the Chief finally seeing the value of social media. If you run across a clip that goes on for another minute or so past what this clip shows, please share a link to it in the comments.

Social Media and the End of Gender

If you didn’t read it the first time I mentioned it in my Useful Links, I suggest you look at Hubs and Connectors: Understanding Networks Through Data Visualization from Whitney Hess. The mapping concepts discussed there seem closely related to the following TED Talk by Johanna Blakley. We don’t connect like we used to.

So, all you really need to know about me is that I’m interested in web education and I like Buffy the Vampire Slayer.

Changes at Google will Reduce Spammy Search Engine Results

There have been complaints. Loud complaints.

I’m talking about complaints about Google’s search results being full of spammy links that lead to nothing of value. The complaint department window at Google was open for business: they were listening, and they’ve made some changes.

Complaints and Alternatives

Before we get into the changes, consider what the complaints had in common: Google was returning too much search result spam.

Most of the complaints mentioned that Google search results were filled with spam results from content scrapers, marketers, or sites that consisted of nothing but keywords surrounded by useless crappy content.

Often mentioned in relation to spammy results are content farms. Demand Media is a name that comes up often when talking about poor-quality content. Yet at the same time, articles like Demand Media Valued at More Than $1 Billion Following IPO also appear. People are investing in Demand Media at the same time that others are asking to have its articles removed from search results. In the article about Demand’s IPO, Adam Ostrow said,

Early indications suggest that concerns over Demand — which owns sites like eHow and Trails.com — being at the whims of Google’s search algorithms are not worrying investors. The company increased the size of its offering by nearly 20% and also raised the offering price, with investors pushing shares another 40% higher when they debuted.

Last week, Google wrote about its plans to take on search engine spam and so-called “content farms,” a category that Demand is often lumped in with because of the way it produces its content, with thousands of low-cost freelancers authoring stories that target popular search terms.

I have a few personal things to say about Demand Media, which I will get to in a minute.

As for alternatives to Google, Michelle Rafter organized a chat for writers (WordCount Jan. 26 chat: Google, search spam and search tools for writers) that suggested alternative ways for writers to find dependable search results. Her chat announcement:

It follows a series of posts on search skills for writers I’ve done recently, including one on alternatives to Google, an update on Help A Reporter Out (HARO), the website that matches reporters’ requests for sources with companies that could provide the information that Vocus acquired last June, and how to get the most out of a HARO query.

Sounds like a great chat; I’m sorry I missed it.

Alternatives to Google have been popping up regularly. One is a new search engine, Blekko, which specifically claims to remove spammy results from your search. See Blekko De-Spams Search Results with Slashtags. Just yesterday, Blekko announced it will block some sites completely. See Blekko Bans Content Farms Like Demand Media’s eHow from its Search Results.

Someone made an extension for Chrome (Google’s browser) that blocks spam results. See Search Engine Blacklist Prevents Spam Sites from Ever Appearing in Your Search Listings.

How Google Stepped Up

As I mentioned, Google was taking note of all this. Their response appeared on the official Google blog in Google search and search engine spam. Here’s part of that announcement.

To respond to that challenge, we recently launched a redesigned document-level classifier that makes it harder for spammy on-page content to rank highly. The new classifier is better at detecting spam on individual web pages, e.g., repeated spammy words—the sort of phrases you tend to see in junky, automated, self-promoting blog comments. We’ve also radically improved our ability to detect hacked sites, which were a major source of spam in 2010. And we’re evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content. We’ll continue to explore ways to reduce spam, including new ways for users to give more explicit feedback about spammy and low-quality sites.

The Google announcement also reaffirmed Google’s principles that buying Google ads does not increase site rankings and having Google ads does not prevent a site from violating Google’s content guidelines.

Christina Warren quotes Google’s Matt Cutts in Google Changes Algorithm To Penalize Site Scrapers, saying the

“net effect is that searchers are more likely to see the sites that wrote the original content rather than a site that scraped or copied the original site’s content.”

Demand Media and Me

For a while, I worked for eHow.com as their expert in the Internet category. I’m telling you this story because eHow is owned by Demand Media. I wrote several hundred articles for eHow that I stand behind as being of comparable quality to anything I’ve ever written for my own blogs.

While I was well paid for each post I wrote at eHow, there were also writers there who were paid not per post but through a system of rewards based on traffic. Eventually, eHow dropped its experts who were paid by the article, and I stopped writing for them at that time. Now writers only receive rewards based on the traffic their articles generate. eHow also wanted the people writing there to move to writing keyword-laden posts for Demand Media. I chose not to do that.

Personally, I am happy about my relationship with eHow and I’m proud of my posts. I’ve read enough articles there to know that the quality is uneven: some are excellent and some are not. But I think Blekko’s announcement that they are going to block eHow completely is an overreaction. There is some good material on eHow, especially from the various experts who wrote there. A better choice would be to filter eHow results for quality.

I’ve never looked at any of the content produced on the Demand Media side of the business, but I know they generate vast amounts of it. And it’s making them money, as you saw from the story on their IPO.

The Results

The changes in Google’s algorithm will affect the profitability of content-farm sites like Demand Media’s, because they will bring higher-quality results back into the top-ranked search results.

The changes will also be helpful to many bloggers who are frustrated by sites that reproduce entire blog posts without permission. If the changes are effective, an original post should rank higher than any scraped content reprinted elsewhere.

Cross-posted at BlogHer in slightly different form.