Web Accessibility Articles and Resources



The Cult of the Complex

31 May 2018 at 6:15am

'Tis a gift to be simple. Increasingly, in our line of work, 'tis a rare gift indeed.

In an industry that extols innovation over customer satisfaction, and prefers algorithm to human judgement (forgetting that every algorithm has human bias in its DNA), perhaps it should not surprise us that toolchains have replaced know-how.

Likewise, in a field where young straight white dudes take an overwhelming majority of the jobs (including most of the management jobs) it's perhaps to be expected that web making has lately become something of a dick measuring competition.

It was not always this way, and it needn't stay this way. If we wish to get back to the business of quietly improving people's lives, one thoughtful interaction at a time, we must rid ourselves of the cult of the complex. Admitting the problem is the first step in solving it.

And the div cries Mary

In 2001, more and more of us began using CSS to replace the non-semantic HTML table layouts with which we'd designed the web's earliest sites. I soon noticed something about many of our new CSS-built sites. I especially noticed it in sites built by the era's expert backend coders, many of whom viewed HTML and CSS as baby languages for non-developers.

In those days, whether from contempt for the deliberate, intentional (designed) limitations of HTML and CSS, or ignorance of the HTML and CSS framers' intentions, many code jockeys who switched from table layouts to CSS wrote markup consisting chiefly of divs and spans. Where they meant list item, they wrote span. Where they meant paragraph, they wrote div. Where they meant level two headline, they wrote div or span with a classname of h2, or, avoiding even that tragicomic gesture toward document structure, wrote a div or span with verbose inline styling. Said div was followed by another, and another. They bred like locusts, stripping our content of structural meaning.

As an early adopter and promoter of CSS via my work in The Web Standards Project (kids, ask your parents), I rejoiced to see our people using the new language. But as a designer who understood, at least on a basic level, how HTML and CSS were supposed to work together, I chafed.

Cry, the beloved font tag

Everyone who wrote the kind of code I just described thought they were advancing the web merely by walking away from table layouts. They had good intentions, but their executions were flawed. My colleagues and I here at A List Apart were thus compelled to explain a few things.

Mainly, we argued that HTML consisting mostly of divs and spans and classnames was in no way better than table layouts for content discovery, accessibility, portability, reusability, or the web's future. If you wanted to build for people and the long term, we said, then simple, structural, semantic HTML was best: each element deployed for its intended purpose. Don't use a div when you mean a p.
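
As a minimal sketch of that principle (the class names are hypothetical, not from any real project), compare the div-and-span approach with its structural equivalent:

<!-- div soup: nothing here tells a browser, search engine, or screen reader what the content is -->
<div class="h2">Our services</div>
<div class="list">
  <span class="item">Design</span>
  <span class="item">Development</span>
</div>

<!-- the same content, marked up for what it means -->
<h2>Our services</h2>
<ul>
  <li>Design</li>
  <li>Development</li>
</ul>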

This basic idea, and I use the adjective advisedly, along with other equally rudimentary and self-evident concepts, formed the basis of my 2003 book Designing With Web Standards, which the industry treated as a revelation, when it was merely common sense.

The message messes up the medium

When we divorce ideas from the conditions under which they arise, the result is dogma and misinformation: two things the internet is great at amplifying. Somehow, over the years, in front-end design conversations, the premise "don't use a div when you mean a p" got corrupted into "divs are bad."

A backlash in defense of divs followed this meaningless running-down of them, as if the W3C had created the div as a forbidden fruit. So, let's be clear. No HTML element is bad. No HTML element is good. A screwdriver is neither good nor bad, unless you try to use it as a hammer. Good usage is all about appropriateness.

Divs are not bad. If no HTML5 element is better suited to an element's purpose, divs are the best and most appropriate choice. Common sense, right? And yet.

Somehow, the two preceding simple sentences are never the takeaway from these discussions. Somehow, over the years, a vigorous defense of divs led to a defiant (or ignorant) overuse of them. In some strange way, stepping back from a meaningless rejection of divs opened the door to gaseous frameworks that abuse them.

Note: We don't mind so much about the abuse of divs. After all, they are not living things. We are not purists. It's the people who use the stuff we design who suffer from our uninformed or lazy over-reliance on these div-ridden gassy tools. And that suffering is what we protest. div-ridden, overbuilt frameworks stuffed with mystery meat offer the developer tremendous power, especially the power to build things quickly. But that power comes at a price your users pay: a hundred tons of stuff your project likely doesn't need, but you force your users to download anyway. And that bloat is not the only problem. For who knows what evil lurks in someone else's code?

Two cheers for frameworks

If you entered web design and development in the past ten years, you've likely learned and may rely on frameworks. Most of these are built on meaningless arrays of divs and spans, structures no better than the bad HTML we wrote in 1995, however more advanced the resulting pages may appear. And what keeps the whole monkey-works going? JavaScript, and more JavaScript. Without it, your content may not render. With it, you may deliver more services than you intended to.

There's nothing wrong with using frameworks to quickly whip up and test product prototypes, especially if you do that testing in a non-public space. And theoretically, if you know what you're doing, and are willing to edit out the bits your product doesn't need, there's nothing wrong with using a framework to launch a public site. Notice the operative phrases: if you know what you're doing, and are willing to edit out the bits your product doesn't need.

Alas, many new designers and developers (and even many experienced ones) feel like they can't launch a new project without dragging in packages from NPM, or Composer, or whatever, with no sure idea what the code therein is doing. The results can be dangerous. Yet here we are, training an entire generation of developers to build and launch projects with untrusted code.

Indeed, many designers and developers I speak with would rather dance naked in public than admit to posting a site built with hand-coded, progressively enhanced HTML, CSS, and JavaScript they understand and wrote themselves. For them, it's a matter of job security and viability. There's almost a fear that if you haven't mastered a dozen new frameworks and tools each year (and by mastered, I mean used), you're slipping behind into irrelevancy. HR folks who write job descriptions listing the ten thousand tool sets you're supposed to know backwards and forwards to qualify for a junior front-end position don't help the situation.

CSS is not broken, and it's not too hard

As our jerrybuilt contraptions, lashed together with fifteen layers of code we don't understand and didn't write ourselves, start to buckle and hiss, we blame HTML and CSS for the faults of developers. This fault-finding gives rise to ever more complex cults of specialized CSS, with internecine sniping between cults serving as part of their charm. New sects spring up, declaring CSS is broken, only to splinter as members disagree about precisely which way it's broken, or which external technology not intended to control layout should be used to "fix" CSS. (Hint: They mostly choose JavaScript.)

Folks, CSS is not broken, and it's not too hard. (You know what's hard? Chasing the ever-receding taillights of the next shiny thing.) But don't take my word for it. Check these out:

Getting Started with CSS Layout – Rachel Andrew, Smashing Magazine
Learn CSS Grid – Jen Simmons
CSS Grid Layout – MDN web docs
Grid by Example – Rachel Andrew
A Complete Guide to Grid – Chris House, CSS-Tricks
Practical CSS Grid: Adding Grid to an Existing Design – Eric Meyer, A List Apart
Jen Simmons Labs
Layout Land – YouTube
A Book Apart: The New CSS Layout, by Rachel Andrew
The Story of CSS Grid, from its Creators – Aaron Gustafson, A List Apart
Transcript: Intrinsic Web Design with Jen Simmons (The Big Web Show)

CSS Grid is here; it's logical and fairly easy to learn. You can use it to accomplish all kinds of layouts that used to require JavaScript and frameworks, plus new kinds of layout nobody's even tried yet. That kind of power requires some learning, but it's good learning, the kind that stimulates creativity, and its power comes at no sacrifice of semantics, or performance, or accessibility. Which makes it web technology worth mastering.
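
By way of a small, illustrative sketch (the selector is hypothetical), a responsive card layout that once called for a framework's grid classes or JavaScript measurements can be a few lines of CSS:

/* cards flow into as many columns as fit, each at least 20em wide;
   no media queries, no JavaScript, and the HTML stays semantic */
.card-list {
  display: grid;
  grid-template-columns: repeat(auto-fill, minmax(20em, 1fr));
  grid-gap: 1.5em; /* newer browsers also accept the shorter gap property */
}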

The same cannot be said for our deluge of frameworks and alternative, JavaScript-based platforms. As a designer who used to love creating web experiences in code, I am baffled and numbed by the growing preference for complexity over simplicity. Complexity is good for convincing people they could not possibly do your job. Simplicity is good for everything else.

Keep it simple, smarty

Good communication strives for clarity. Design is at its most brilliant when it appears most obvious, most simple. The question for web designers should never be how complex can we make it. But that's what it has become. Just as, in pursuit of "delight," we forget the true joy reliable, invisible interfaces can bring, so too, in chasing job security, do we pile on the platform requirements, forgetting that design is about solving business and customer problems … and that baseline skills never go out of fashion. As ALA's Brandon Gregory, writing elsewhere, explains:

I talk with a lot of developers who list Angular, Ember, React, or other fancy JavaScript libraries among their technical skills. That's great, but can you turn that mess of functions the junior developer wrote into a custom extensible object that we can use on other projects, even if we don't have the extra room for hefty libraries? Can you code an image slider with vanilla JavaScript so we don't have to add jQuery to an older website just for one piece of functionality? Can you tell me what recursion is and give me a real-world example?

“I interview web developers. Here’s how to impress me.”

Growing pains

There's a lot of complexity to good design. Technical complexity. UX complexity. Challenges of content and microcopy. Performance challenges. This has never been and never will be an easy job.

Simplicity is not easy, not for us, anyway. Simplicity means doing the hard work that makes experiences appear seamless: the sweat and torture-testing and failure that eventually, with enough effort, yields experiences that seem to "just work."

Nor, in lamenting our industry's turn away from basic principles and resilient technologies, am I suggesting that CDNs and Git are useless. Or wishing that we could go back to FTP, although I did enjoy the early days of web design, when one designer could do it all. I'm glad I got to experience those simpler times.

But I like these times just fine. And I think you do, too. Our medium is growing up, and it remains our great privilege to help shape its future while creating great experiences for our users. Let us never forget how lucky we are, nor, in chasing the ever-shinier, lose sight of the people and purpose we serve.


Onboarding: A College Student Discovers A List Apart

22 May 2018 at 6:02am

What would you say if I told you I just read and analyzed over 350 articles from A List Apart in less than six weeks? "You're crazy!" might have passed through your lips. In that case, what would you say if I was doing it for a grade? Well, you might say that makes sense.

As a part of an Independent Research Study for my undergraduate degree, I wanted to fill in some of the gaps I had when it came to working with the World Wide Web. I wanted to know more about user experience and user interface design; however, I needed the most help getting to know the industry in general. Naturally, my professor directed me to A List Apart.

At first I wasn't sure what I was going to get out of the assignment other than the credit I needed to graduate. What could one website really tell me? As I read article after article, I realized that I wasn't just looking at a website; I was looking at a community. A community with history in which people have struggled to build the right way. One that is constantly working to be open to all. One that is always learning, always evolving, and sometimes hard to keep up with. A community that, without my realizing it, I had become a part of. For me, the web has pretty much always been there, but now that I am better acquainted with its past, I am energized to be a part of its future. Take a look at some of the articles that inspired this change in me.

A bit of history

I started in the Business section and went back as far as November 1999. What a whirlwind that was! I had no idea what people went through and the battles that they fought to make the web what it is today. Now, I don't mean to date any of you lovely readers, but I would have been three years old when the first business article on A List Apart was published, so everything I read until about 2010 was news to me.

For instance, when I came across Jeffrey Zeldman's "Survivor! (How Your Peers Are Coping with the Dotcom Crisis)" that was published in 2001, I had no idea what he was talking about! The literal note I wrote for that article was: "Some sh** went down in the late 1990s???" I was in the dark until I had the chance to Google it and sheepishly ask my parents.

I had the same problem with the term Web 2.0. It wasn't until I looked it up that I realized I didn't know what it was, because I never experienced Web 1.0 (having not had access to the internet until 2004). In that short time, the industry had completely reinvented itself before I ever had a chance to log on!

The other bit of history that surprised me was how long and hard people had to fight to get web standards and accessibility in line. In school I've always been taught to make my sites accessible, and that just seemed like common sense to me. I guess I now understand why I have mixed feelings about Flash.

What I learned about accessibility

Accessibility is one of the topics I took a lot of notes on. I was glad to see that although a lot of progress had been made in this area, people were still taking the time to write about and constantly make improvements to it. In Beth Raduenzel's "A DIY Web Accessibility Blueprint," she explains the fundamentals to remember when designing for accessibility, including considering:

keyboard users; blind users; color-blind users; low-vision users; deaf and hard-of-hearing users; users with learning disabilities and cognitive limitations; mobility-impaired users; users with speech disabilities; and users with seizure disorders.

It was nice to have someone clearly spell it out. However, the term "user" was used a lot. This distances us from the people we are supposed to be designing for. Anne Gibson feels the same way; in her article, she states that "[web] accessibility means that people can use the web." All people. In "My Accessibility Journey: What I've Learned So Far," Manuel Matuzović gives exact examples of this:

If your site takes ten seconds to load on a mobile connection, it's not accessible. If your site is only optimized for one browser, it's not accessible. If the content on your site is difficult to understand, your site isn't accessible.

It goes beyond just people with disabilities (although they are certainly not to be discounted).

I learned a lot of tips for designing with specific people in mind. Like including WAI-ARIA in my code to benefit visually-impaired users, and checking the color contrast of my site for people with color blindness and low-vision problems. One article even inspired me to download a Sketch plugin to easily check the contrast of my designs in the future. I'm more than willing to do what I can to allow my website to be accessible to all, but I also understand that it's not an easy feat, and I will never get it totally right.
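
As a small, hypothetical sketch of the kind of ARIA touch-up described above (the icon class is made up for illustration), an icon-only button can be given a name that screen readers will announce while its decorative glyph stays out of the accessibility tree:

<!-- aria-label names the control; aria-hidden hides the purely decorative icon -->
<button type="button" aria-label="Close dialog">
  <span class="icon icon-x" aria-hidden="true"></span>
</button>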

User research and testing methods that were new to me

Nevertheless, we still keep learning. Another topic on A List Apart I desperately wanted to absorb was the countless research, testing, and development methods I came across in my readings. Every time I turn around, someone else has come up with another way of working, and I'm always trying to keep my finger in the pie.

I'm happy to report that the majority of the methods I read about I already knew about and have used in my own projects at school. I've been doing open interview techniques, personas, style tiles, and element collages all along, but I was surprised by how many new practices I'd come across.

The Kano Model, the Core Model, Wizard of Oz prototyping, and think-alouds were some of the methods that piqued my curiosity. Others like brand architecture research, call center log analysis, clickstream analysis, search analytics, and stakeholder reviews I've heard of before, but have never been given the opportunity to try.

Unattended qualitative research, A/B testing and fake-door testing are those that stood out to me. I liked that they allow you to conduct research even if you don't have any users in front of you. I learned a lot of new terms and did a lot of research in this section. After all, it's easy to get lost in all the jargon.

The endless amount of abbreviations

I spent a lot of my time Googling terms during this project, especially with the older articles that mentioned programs like Fireworks that aren't really used anymore. One of my greatest fears in working with web design is that someone will ask me something and I will have no idea what they are talking about. When I was reading all the articles, I had the hardest time with the substantial amount of abbreviations I came across: AJAX, API, ARIA, ASCII, B2B, B2C, CMS, CRM, CSS, EE, GUI, HTML, IIS, IPO, JSP, MSA, RFP, ROI, RSS, SASS, SEM, SEO, SGML, SOS, SOW, SVN, and WYSIWYG, just to name a few. Did you manage to get them all? Probably not.

We don't use abbreviations in school because they aren't always clear and the professors know we won't know what they mean. To a newbie like me, these abbreviations feel like a barrier. A wall that divides the veterans of the industry and those trying to enter it. I can't imagine how the clients must feel.

It seems as if I am not alone in my frustrations. Inayaili de León says in her article "Becoming Better Communicators," "We want people to care about design as much as we do, but how can they if we speak to them in a foreign language?" I'm training to be a designer, I'm in Design, and I had to look up almost every abbreviation listed above.

What I learned about myself

Prior to taking on this assignment, I would have been very hesitant to declare myself capable of creating digital design. To my surprise, I'm not alone. Matt Griffin thinks, "… the constant change and adjustments that come with living on the internet can feel overwhelming." Kendra Skeene admits, "It's a lot to keep track of, whether you've been working on the web for [twenty] years or only [twenty] months."

My fear of not knowing all the fancy lingo was lessened when I read Lyza Danger Gardner's "Never Heard of It." She is a seasoned professional who admits to not knowing it all, so I, a soon-to-be-grad, can too. I have good foundations and Google on my side for those pesky abbreviations that keep popping up. As long as I just remember to use my brain as Dave Rupert suggests, when I go to get a job I should do just fine.

Entering the workplace

Before starting this assignment, I knew I wanted to work in digital and interaction design, but I didn't know where. I was worried I didn't know enough about the web to be able to design for it, that all the jobs out there would require me to know coding languages I'd never heard of before, and I'd have a hard time standing out among the crowd.

The articles I read on A List Apart supplied me with plenty of solid career advice. After reading articles written by designers, project managers, developers, marketers, writers, and more, I've come out with a better understanding of what kind of work I want to do. In the article "80/20 Practitioners Make Better Communicators," Katie Kovalcin makes a good point about not forcing yourself to learn skills just because you feel the need to:

We've all heard the argument that designers need to code. And while that might be ideal in some cases, the point is to expand your personal spectrum of skills to be more useful to your team, whether that manifests itself in the form of design, content strategy, UX, or even project management. A strong team foundation begins by addressing gaps that need to be filled and the places where people can meet in the middle.

I already have skills that someone desperately needs. I just need to find the right fit and expand my skills from there. Brandon Gregory also feels that hiring isn't all about technical knowledge. In his article, he says, "personality, fit with the team, communication skills, openness to change, [and] leadership potential" are just as important.

Along with solid technical fundamentals and good soft skills, it seems as if having a voice is also crucial. When I read Jeffrey Zeldman's article "The Love You Make," it became clear to me that if I ever wanted to get anywhere with my career, I was going to have to start writing.

Standout articles

The writers on A List Apart have opened my eyes to many new subjects and perspectives on web design. I particularly enjoyed looking through the game design lens in Graham Herrli's "Gaming the System … and Winning." It was one of the few articles where I copied his diagram on interaction personality types and their goals into my notebook. Another article that made me consider a new perspective was "The King vs. Pawn Game of UI Design" by Erik Kennedy. To start with one simple element and grow from there really made something click in my head.

However, I think that the interview I read between Mica McPheeters and Sara Wachter-Boettcher stuck with me the most. I actually caught myself saying "hmm" out loud as I was reading along. Sara's point about crash-test dummies being sized to the average male completely shifted my understanding about how important user-centered design is. Like, life-or-death important. There is no excuse not to test your products or services on a variety of users if this is what's at stake! It's an article I'm glad I read.

Problems I've noticed in the industry

During the course of my project, I noticed some things about A List Apart, the site I was spending so much time on. For example, it wasn't until I got to the articles that were published after 2014 that I really started to understand and relate to the content; funnily enough, that was the year I started my design degree.

I also noticed that it was around this time that female writers became much more prominent on the site. Today there may be many women on A List Apart, but I must point out a lack of women of color. Shoutout to Aimee Gonzalez-Cameron for her article "Hello, My Name is <Error>," a beautiful assertion for cultural inclusion on the web through user-centered design.

Despite the lack of representation of women of color, I was very happy to see many writers acknowledge their privilege in the industry. Thanks to Cennydd Bowles, Matt Griffin, and Rian van der Merwe for their articles. My only qualm is that the topic of privilege has only appeared on A List Apart in the last five years. Because isn't it kinda ironic? As creators of the web we aim to allow everyone access to our content, but not everyone has access to the industry itself. Sara Wachter-Boettcher wrote an interesting article that expands on this idea, which you should read if you haven't already. However, I won't hold it against any of you. That's why we are here anyway: to learn.

The takeaway

Looking back at this assignment, I'm happy to say that I did it. It was worth every second (even with the possible eye damage from reading off my computer screen for hours on end). It was worth it because I learned more than I had ever anticipated. I received an unexpected history lesson of the recent internet past. I was bombarded by an explosion of new terms and abbreviations. I learned a lot about myself and how I can possibly fit into this community. Most importantly, I came out on the other end with more confidence in myself and my abilities, which is probably the greatest graduation gift I could receive from a final project in my last year of university. Thanks for reading, and wish me luck!

Thanks

Thanks to my Interactive Design professor Michael LeBlanc for giving me this assignment and pushing me to take it further.


So You Want to Write an Article?

10 May 2018 at 7:30am

So you want to write an article. Maybe you've got a great way of organizing your CSS, or you're a designer who has a method of communicating really well with developers, or you have some insight into how to best use a new technology. Whatever the topic, you have insights, you've read the basics of finding your voice, and you're ready to write and submit your first article for a major publication. Here's the thing: most article submissions suck. Yours doesn't have to be one of them.

At A List Apart, we want to see great minds in the industry write the next great articles, and you could be one of our writers. I've been on the editorial team here for about nine months now, and I've written a fair share of articles here as well. Part of what I do is review article submissions and give feedback on what's working and what's not. We publish different kinds of articles, but many of the submissions I see, particularly from newer writers, fall into the same traps. If you're trying to get an article published in A List Apart or anywhere else, knowing these common mistakes can help your article's chances of being accepted.

Keep introductions short and snappy

Did you read the introduction above? My guess is a fair share of readers skipped straight to this point. That's pretty typical behavior, especially for articles like this one that offer several answers to one clear question. And that's totally fine. If you're writing, realize that some people will do the same thing. There are some things you can do to improve the chances of your intro being read, though.

Try to open with a bang. A recent article from Caroline Roberts has perhaps the best example of this I've ever seen: "I won an Emmy for keeping a website free of dick pics." When I saw that in the submission, I was instantly hooked and read the whole thing. It's hilarious, it shows she has expertise on managing content, and it shows that the topic is more involved and interesting than it may at first seem. A more straightforward introduction to the topic of content procurement would seem very boring in comparison. Your ideas are exciting, so show that right away if you can. A funny or relatable story can also be a great way to lead into an article; just keep it brief!

If you can't open with a bang, keep it short. State the problem, maybe put something about why it matters or why you're qualified to write about it, and get to the content as quickly as possible. If a line in your introduction does not add value to the article, delete it. There's little room for meandering in professional articles, but there's absolutely no room for it in introductions.

Get specific

Going back to my first article submission for A List Apart, way before I joined the team, I wanted to showcase my talent and expertise, and I thought the best way to do this was to showcase all of it in one article. I wrote an overview of professional skills for web professionals. There was some great information in there, based on my years of experience working up through the ranks and dealing with workplace drama. I was so proud when I submitted the article. It wasn't accepted, but I got some great feedback from the editor-in-chief: get more specific.

The most effective articles I see deal with one central idea. The more disparate ideas I see in an article, the less focused and impactful the article is. There will be exceptions to this, of course, but those are rarer than articles that suffer for this. Don't give yourself a handicap by taking an approach that fails more often than it succeeds.

Covering one idea in great detail, with research and examples to back it up, usually goes a lot further in displaying your expertise than an overview of a bunch of disparate thoughts. Truth be told, a lot of people have probably arrived at the same ideas you have. The insights you have are not as important as your evidence and eloquence in expressing them.

Can an overview article work? Actually, yes, but you need to frame it within a specific problem. One great example I saw was an overview of web accessibility (which has not been published yet). The article followed a fictional project from beginning to end, showing how each team on the project could work toward a goal of accessibility. But the idea was not just accessibility; it was how leaders and project managers could assign responsibility in regards to accessibility. It was a great submission because it began with a problem of breadth and offered a complete solution to that problem. But it only worked because it was written specifically for an audience that needed to understand the whole process. In other words, the comprehensive nature of the article was the entire point, and it stuck to that.

Keep your audience in mind

You have a viewpoint. A problem I frequently see with new submissions is forgetting that the audience also has its viewpoint. You have to know your audience and remember how the audience's mindset matches yours, or doesn't. In fact, you'll probably want to state in your introduction who the intended audience is to hook the right readers. To write a successful article, you have to keep that audience in mind and write for it specifically.

A common mistake I see writers make is using an article to vent their frustrations about the people who won't listen to them. The problem is that the audience of our publication usually agrees with the author on these points, so a rant about why he or she is right is ultimately pointless. If you're writing for like-minded people, it's usually best to assume the readers agree with you and then either delve into how to best accomplish what you're writing about or give them talking points to have that conversation in their workplace. Write the kind of advice you wish you'd gotten when those frustrations first surfaced.

Another common problem is forgetting what the audience already knows, or doesn't know. If something is common knowledge in your industry, it doesn't need another explanation. You might link out to another explanation somewhere else just in case, but there's no need to start from scratch when you're trying to make a new point. At the same time, don't assume that all your readers have the same expertise you do. I wrote an article on some higher-level object-oriented programming concepts, something many JavaScript developers are not familiar with. Rather than spend half the article giving an overview of object-oriented programming, though, I provided some links at the beginning of the article that gave a good overview. Pro tip: if you can link out to articles from the same publication you're submitting to, publications will appreciate the free publicity.

Defining your audience can also really help with knowing their viewpoint. Many times when I see a submission with two competing ideas, they're written for different audiences. In my article I mentioned above, I provide some links for developers who may be new to object-oriented programming, but the primary audience is developers who already have some familiarity with it and want to go deeper. Trying to cater to both audiences wouldn't have doubled the readership; it would have reduced it by making a large part of the article less relevant to readers.

Keep it practical

I'll admit, of all these tips, this is the one I usually struggle with the most. I'm a writer who loves ideas, and I love explaining them in great detail. While there are some readers who appreciate this, most are looking for some tangible ways to improve something. This isn't to say that big concepts have no place in professional articles, but you need to ask why they are there. Is your five-paragraph explanation of the history of your idea necessary for the reader to make the improvements you suggest?

This became abundantly clear to me in my first submission of an article on managing ego in the workplace. I love psychology and initially included a lengthy section up-front on how our self-esteem springs from the strengths we leaned on growing up. While this fascinated me, it wasn't right for an audience of web professionals who wanted advice on how to improve their working relationships. Based on feedback I received, I removed the section entirely and added a section on how to manage your own ego in the workplace (much more practical), and that ended up being a favorite section in the final piece.

Successful articles solve a problem. Begin with the problem, set it up in your introduction, maybe tell a little story that illustrates how this problem manifests, and then build a case for your solution. The problem should be clear to the reader very early on in the article, and the rest of the article should all be related to that problem. There is no room for meandering and pontification in a professional article. If the article is not relevant and practical, the reader will move on to something else.

The litmus test for determining the practicality of your article is to boil it down to an outline. Of course all of your writing is much more meaningful than an outline, but look at the outline. There should be several statements along the lines of "Do this," or "Don't do this." You can have other statements, of course, but they should all be building toward some tangible outcome with practical steps for the reader to take to solve the problem set up in your introduction.

It's a hard truth you have to learn as a writer that you'll be much more in love with your ideas than your audience will. Writing professional articles is not about self-expression; it's about helping and serving your readers. The more clear and concise the content you offer, the more your article will be read and shared.

Support what you say

Your opinions, without evidence to support them, will only get you so far. As a writer, your ideas are probably grounded in a lot of real evidence, but your readers don't know that; you'll have to show it. How do you show it? Write a first draft and get your ideas out. Then do another pass to look for stories, stats, and studies to support your ideas. Trying to make a point without at least one of these is at best difficult and at worst empty hype. Professionals in your industry are less interested in platitudes and more interested in results. Having some evidence for your claims goes a long way toward demonstrating your expertise and proving your point.

Going back to my first article in A List Apart, on defusing workplace drama, I had an abstract point to prove, and I needed to show that my insights meant something. My editor on that article was fantastic and asked the right questions to steer me toward demonstrating the validity of my ideas in a meaningful way. Personal stories made up the backbone of the article, and I was able to find social psychology studies to back up what I was saying. These illustrations of the ideas ended up being more impactful than the ideas themselves, and the article was very well-received in the community.

Storytelling can be an amazing way to bring your insights to life. Real accounts or fictional, well-told stories can serve to make big ideas easier to understand, and they work best when representing typical scenarios, not edge cases. If your story goes against common knowledge, readers will pick up on that instantly and you'll probably get some nasty comments. Never use a story to prove a point that doesn't have any other hard evidence to back it up; use stories to illustrate points or make problems more relatable. Good stories are often the most memorable parts of articles and make your ideas and assertions easier to remember.

Stats are one of the easiest ways to make a point. If you're arguing that ignoring website accessibility can negatively impact the business, some hard numbers are going to say a lot more than stories. If there's a good stat to prove your point, always include it, and always be on the lookout for relevant numbers. As with stories, though, you should never try to use stats to distort the truth or prove a point that doesn't have much else to support it. Mark Twain once said, "There are three kinds of lies: lies, damned lies, and statistics." You shouldn't decide what to say and then scour the internet for ways to back it up. Base your ideas on the numbers; don't base your selection of facts on your idea.

Studies, including both user experience studies and social psychology experiments, are somewhere in between stories and stats, and a lot of the same advantages and pitfalls also apply. A lot of studies can be expressed as a story: write a quick bit from the point of view of the study participant, then go back and explain what's really going on. This can be just as engaging and memorable as a good story, but studies usually result in stats, which usually serve to make the stories significantly more authoritative. And remember to link out to the study for people who want to read more about it!

Just make sure your study wasn't disproved by later studies. In my first article, linked above, I originally referenced a study to introduce the bystander effect, but an editor wisely pointed out that there's actually a lot of evidence against that interpretation of the well-known study. Interpretations can change over time, especially as new information comes out. I found a later, more relevant study that illustrated the point better and was less well-known, so it made for a better story.

Kill your darlings

Early twentieth century writer and critic Arthur Quiller-Couch once said in a speech, "Whenever you feel an impulse to perpetrate a piece of exceptionally fine writing, obey it – whole-heartedly – and delete it before sending your manuscript to press. Murder your darlings." Variants of this quote were repeated by many authors throughout the twentieth century, and it's just as true today as when he originally said it.

What does that mean for your article? Great prose, great analogies, great stories (any bits of brilliant writing that you churn out) only mean as much as they contribute to the subject at hand. If it doesn't contribute anything, it needs to be killed.

When getting your article ready for submission, your best friend will be the backspace or delete key on your keyboard. Before submitting, do a read-through for the express purpose of deleting whatever you can to trim down the article. Articles are not books. Brevity is a virtue, and it usually ends up being one of the most important virtues in article submissions.

Your intro should have a clear thesis so readers know what the article is about. For every bit of writing that follows it, ask if it contributes to your argument. Does it illustrate the problem or solution? Does it give the reader empathy for or understanding of the people you're trying to help? Does it give them guidance on how to have these conversations in their workplaces? If you can't relate a sentence back to your original thesis, it doesn't matter how brilliant it is; it should be deleted.

Humor can be useful, but many jokes serve as little more than an aside or distraction from the main point. Don't interrupt your train of thought with a cute joke; use a joke to make your thoughts more clear. It doesn't matter how funny the joke is; if it doesn't help illustrate or reinforce one of your points, it needs to go.

There are times when a picture really is worth a thousand words. Don't go crazy with images and illustrations in your piece, but if a quick graphic is going to save you a lengthy explanation, go that route.

So what are you waiting for?

The industry needs great advice in articles, and many of you could provide that. The points I've delved into in this article aren't just formalities and vague ideas; the editing team at A List Apart has weighed in, and these are problems we see often that weaken articles and make them less accessible to readers. Heeding this advice will strengthen your professional articles, whether you plan to submit to A List Apart or anywhere else. The next amazing article in A List Apart could be yours, and we hope to see you get there.


The Illusion of Control in Web Design

26 Apr 2018 at 6:02am

We all want to build robust and engaging web experiences. We scrutinize every detail of an interaction. We spend hours getting the animation swing just right. We refactor our JavaScript to shave tiny fractions of a second off load times. We control absolutely everything we can, but the harsh reality is that we control less than we think.

Last week, two events reminded us, yet again, of how right Douglas Crockford was when he declared the web "the most hostile software engineering environment imaginable." Both were serious enough to take down an entire site (actually hundreds of entire sites, as it turned out). And both were avoidable.

By understanding what we control (and what we don't), we can build resilient, engaging products for our users.

What happened?

The first of these incidents involved the launch of Chrome 66. With that release, Google implemented a security patch with serious implications for folks who weren't paying attention. You might recall that quite a few questionable SSL certificates issued by Symantec Corporation's PKI began to surface early last year. Apparently, Symantec had subcontracted the creation of certificates without providing a whole lot of oversight. Long story short, the Chrome team decided the best course of action with respect to these potentially bogus (and security-threatening) SSL certificates was to set an "end of life" for accepting them as secure. They set Chrome 66 as the cutoff.

So, when Chrome 66 rolled out (an automatic, transparent update for pretty much everyone), suddenly any site running HTTPS on one of these certificates would no longer be considered secure. That's a major problem if the certificate in question is for our primary domain, but it's also a problem if it's for a CDN we're using. You see, my server may be running on a valid SSL certificate, but if I have my assets (images, CSS, JavaScript) hosted on a CDN that is not secure, browsers will block those resources. It's like CSS Naked Day all over again.

To be completely honest, I wasn't really paying attention to this until Michael Spellacy looped me in on Twitter. Two hundred of his employer's sites were instantly reduced to plain old semantic HTML. No CSS. No images. No JavaScript.

The second incident was actually quite similar in that it also involved SSL, and specifically the expiration of an SSL certificate being used by jQuery's CDN. If a site relied on that CDN to serve an HTTPS-hosted version of jQuery, their users wouldn't have received it. And if that site was dependent on jQuery to be usable … well, ouch!

For what it's worth, this isn't the first time incidents like these have occurred. Only a few short years ago, Sky Broadband's parental filter dramatically miscategorized the jQuery CDN as a source of malware. With that designation in place, they spent the better part of a day blocking all requests for resources on that domain, affecting nearly all of their customers.

It can be easy to shrug off news like this. Surely we'd make smarter implementation decisions if we were in charge. We'd certainly have included a local copy of jQuery like the good Boilerplate tells us to. The thing is, even with that extra bit of protection in place, we're falling for one of the most attractive fallacies when it comes to building for the web: that we have control.
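
For reference, that Boilerplate-style protection is roughly the following (the version number and local path are placeholders): load the library from the CDN, then test for it and write in a copy from your own server if the CDN copy never arrived.

<script src="https://code.jquery.com/jquery-3.3.1.min.js"></script>
<script>
  // if the CDN failed (or was blocked), fall back to a local copy
  window.jQuery || document.write('<script src="/js/vendor/jquery-3.3.1.min.js"><\/script>');
</script>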

Lost in transit?

There are some things we do control on the web, but they may be fewer than you think. As a solo dev or team lead, we have considerable control over the HTML, CSS, and JavaScript code that ultimately constructs our sites. Same goes for the tools we use and the hosting solutions we've chosen. Of course, that control lessens on large teams or when others are calling the shots, though in those situations we still have an awareness of the coding conventions, tooling, and hosting environment we're working with. Once our carefully-crafted code leaves our servers, however, all bets are off.

First off, we don't (at least in the vast majority of cases) control the network our code traverses to reach our users. Ideally our code takes an optimized path so that it reaches its destination quickly, yet any one of the servers along that path can read and manipulate the code. If you've heard of "man-in-the-middle" attacks, this is how they happen.

For example, certain providers have no qualms about injecting their own advertising into your pages. Gross, right? HTTPS is one way to stop this from happening (and to prevent servers from being able to snoop on our traffic), but some providers have even found a way around that. Sigh.

Lost in translation?

Assuming no one touches our code in transit, the next thing standing between our users and our code is the browser. These applications are the gateways to (and gatekeepers of) the experiences we build on the web. And, even though the last decade has seen browser vendors coalesce around web standards, there are still differences to consider. Those differences are yet another factor that will make or break the experience our users have.

While every browser vendor supports the idea and ongoing development of standards, they do so at their own pace and very much in relation to their business interests. They prioritize features that help them meet their own goals and can sometimes be reluctant or slow to implement new features. Occasionally, as happened with CSS Grid, everyone gets on board rather quickly, and we can see a new spec go from draft to implementation within a single calendar year. Others, like Service Worker, can take hold quickly in a handful of browsers but take longer to roll out in others. Still others, like Pointer Events, might get implemented widely, only to be undermined by one browser's indifference.

All of this is to say that the browser landscape is much like the Great Plains of the American Midwest: from afar it looks very even, but walking through it we're bound to stumble into a prairie dog burrow or two. And to successfully navigate the challenges posed by the browser environment, it pays to get familiar with where those burrows lie so we don't lose our footing. Object detection … font stacks … media queries … feature detection … these tools (and more) help us ensure our work doesn't fall over in less-than-ideal situations.
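
To sketch one of those tools (the selector here is hypothetical), CSS feature queries let a layout enhance itself only where the browser understands the newer property, while everyone else keeps the simpler default:

/* a single-column default that works in every browser */
.gallery { max-width: 40em; }

/* only browsers that understand Grid get the multi-column enhancement */
@supports (display: grid) {
  .gallery {
    display: grid;
    grid-template-columns: 1fr 1fr 1fr;
    max-width: none;
  }
}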

Beyond standards support, it's important to recognize that some browsers include optimizations that can affect the delivery of your code. Opera Mini and Amazon's Silk are examples of the class of browser often referred to as proxy browsers. Proxy browsers, as their name implies, position their own proxy servers in between our domains and the end user. They use these servers to do things like optimize images, simplify markup, and jettison unsupported JavaScript in the interest of slimming the download size of our pages. Proxy browsers can be a tremendous help for users paying for downloads by the bit, especially given our penchant for increasing web page sizes year upon year.

If we don't consider how these browsers can affect our pages, our site may simply collapse and splay its feet in the air like a fainting goat. Consider this JavaScript taken from an example I threw up on Codepen:

document.body.innerHTML += '<p>Can I count to four?</p>';
for (let i=1; i<=4; i++) {
  document.body.innerHTML += '<p>' + i + '</p>';
}
document.body.innerHTML += '<p>Success!</p>';

This code is designed to insert several paragraphs into the current document and, when executed, produces this:

Can I count to four? 1 2 3 4 Success!

Simple enough, right? Well, yes and no. You see, this code makes use of the let keyword, which was introduced in ECMAScript 2015 (a.k.a. ES6) to enable block-level variable scoping. It will work a treat in browsers that understand let. However, any browsers that don't understand let will have no idea what to make of it and won't execute any of the JavaScript (not even the parts they do understand) because they don't know how to interpret the program. Users of Opera Mini, Internet Explorer 10, QQ, and Safari 9 would get nothing.
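
As an illustrative sketch (not part of the original Codepen), the same routine written with ES5-era syntax, whether by hand or via a transpiler, parses fine in those older browsers:

// var has been understood since the earliest JavaScript engines,
// so older parsers never choke on the program before running it
document.body.innerHTML += '<p>Can I count to four?</p>';
for (var i = 1; i <= 4; i++) {
  document.body.innerHTML += '<p>' + i + '</p>';
}
document.body.innerHTML += '<p>Success!</p>';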

This is a relatively simplistic example, but it underscores the fragility of JavaScript. The UK's GDS ran a study to determine how many of their users didn't get JavaScript enhancements and discovered that 0.9% of their users who should have received them (in other words, their browser supported JavaScript and they had not turned it off) didn't for some reason. Add in the 0.2% of users whose browsers did not support JavaScript or who had turned it off, and the total non-JavaScript constituency was 1.1%, or 1 in every 93 people who visit their site.

It's worth keeping in mind that browsers must understand the entirety of our JavaScript before they can execute it. This may not be a big deal if we write all of our own JavaScript (though we all occasionally make mistakes), but it becomes a big deal when we include third-party code like JavaScript libraries, advertising code, or social media buttons. Errors in any of those codebases can cause problems for our users.

Browser plugins are another form of third-party code that can negatively affect our sites. And they're ones we don't often consider. Back in the early '00s, I remember spending hours trying to diagnose a site issue reported by one of my clients, only to discover it only occurred when using a particular plugin. Anger and self-doubt were wreaking havoc on me as I failed time and time again to reproduce the error my client was experiencing. It took me traveling the two hours to her office and sitting down at her desk to discover the difference between her setup and mine: a third-party browser toolbar.

We don't have the luxury of traveling to our users' homes and offices to determine if and when a browser plugin is hobbling our creations. Instead, the best defense against the unknowns of the browsing environment is to always design our sites with a universally usable baseline.

Lost in interpretation?

Regardless of everything discussed so far, when our carefully crafted website finally reaches its destination, it has one more potential barrier to success: us. Specifically, our users. More broadly, people. Unless our product is created solely for the consumption of some other life form or machine, we've got to consider the ultimate loss of control when we cede it to someone else.

Over the course of my twenty years of building websites for customers, I've always had the plaintive voice of Clerks' Randal Graves in the back of my head: "This job would be great if it wasn't for the f'ing customers." I'm not happy about that. It's an arrogant position (surely), yet an easy one to lapse into.

People are so needy. Wouldn't it be great if we could just focus on ourselves?

No, that wouldn't be good at all.

When we design and build for people like us, we exclude everyone who isn't like us. And that's most people. I'm going to put on my business hat here (Fedora? Bowler? Top hat?) and say that artificially limiting our customer base is probably not in our company's best interest. Not only will it limit our potential revenue growth, it could actually reduce our income if we become the target of a legal complaint by an excluded party.

Our efforts to build robust experiences on the web must account for the actual people that use them (or may want to use them). That means ensuring our sites work for people who experience motor impairments, vision impairments, hearing impairments, vestibular disorders, and other things we aggregate under the heading of "accessibility." It also means ensuring our sites work well for users in a variety of contexts: on large screens, small screens, even in-between screens. Via mouse, keyboard, stylus, finger, and even voice. In dark, windowless offices, glass-walled conference rooms, and out in the midday sun. Over blazingly fast fiber and painfully slow cellular networks. Wherever people are, however they access the web, whatever special considerations need to be made to accommodate them … we should build our products to support them.

That may seem like a tall order, but consider this: removing access barriers for one group has a far-reaching ripple effect that benefits others. The roadside curb cut is an example we often cite. It was originally designed for wheelchair access, but stroller-pushing parents, children on bicycles, and even that UPS delivery person hauling a tower of Amazon boxes down Seventh Avenue all benefit from that rather simple consideration.

Maybe you're more of a numbers person. If so, consider designing your interface such that it's easier to use by someone who only has use of one arm. Every year, about 26,000 people in the U.S. permanently lose the use of an upper extremity. That's a drop in the bucket compared to an overall population of nearly 326 million people. But that's a permanent impairment. There are two other forms of impairment to consider: temporary and situational. Breaking your arm can mean you lose use of that hand (maybe your dominant one) for a few weeks. About 13 million Americans suffer an arm injury like this every year. Holding a baby is a situational impairment in that you can put it down and regain use of your arm, but the feasibility of that may depend greatly on the baby's temperament and sleep schedule. About 8 million Americans welcome this kind of impairment (sweet and cute as it may be) into their home each year, and this particular impairment can last for over a year. All of this is to say that designing an interface that's usable with one hand (or via voice) can help over 21 million more Americans (about 6% of the population) effectively use your service.
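
Roughly, that is how the figures above combine:

26,000 permanent + 13,000,000 temporary + 8,000,000 situational ≈ 21,000,000 people per year
21,000,000 ÷ 326,000,000 ≈ 6.4% of the U.S. population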

Finally, and in many ways coming full circle, there's the copy we employ. Clear, well-written, and appropriate copy is the bedrock of great experiences on the web. When we draft copy, we should do so with a good sense of how our users talk to one another. That doesn't mean we should pepper our legalese with slang, but it does mean we should author copy that is easily understood. It should be written at an appropriate reading level, devoid of unnecessary jargon and idioms, and approachable to both native and non-native speakers alike. Nestled in the gentle embrace of our (hopefully) semantic, server-rendered HTML, the copy we write is one of the only experiences of our sites we can pretty much guarantee our users will have.

Old advice, still relevant

Recognizing all of the ways our carefully-crafted experiences can be rendered unusable can be more than a little disheartening. No one likes to spend their time thinking about failure. So don't. Don't focus on all of the bad things you can't control. Focus on what you can control.

Start simply. Code defensively. User-test the heck out of it. Recognize the chaos. Embrace it. And build resilient web experiences that will work no matter what the internet throws at them.


Designing for Research

20 Mar 2018 at 6:09am

If you’ve spent enough time developing for the web, this piece of feedback has landed in your inbox since time immemorial:

"This photo looks blurry. Can we replace it with a better version?"

Every time this feedback reaches me, I'm inclined to question it: "What about the photo looks bad to you, and can you tell me why?"

That’s a somewhat unfair question to counter with. The complaint is rooted in a subjective perception of image quality, which in turn is influenced by many factors. Some are technical, such as the export quality of the image or the compression method (often lossy, as is the case with JPEG-encoded photos). Others are more intuitive or perceptual, such as content of the image and how compression artifacts mingle within. Perhaps even performance plays a role we’re not entirely aware of.

Fielding this kind of feedback for many years eventually led me to design and develop an image quality survey, which was my first go at building a research project on the web. I started with twenty-five photos shot by a professional photographer. With them, I generated a large pool of images at various quality levels and sizes. Images were served randomly from this pool to users, who were asked to rate what they thought of their quality.

Results from the first round were interesting, but not entirely clear: users seemed to have a tendency to overestimate the actual quality of images, and poor performance appeared to have a negative impact on perceptions of image quality, but neither finding could be stated conclusively. A number of UX and technical issues also made a second round of research necessary. Rather than spin my wheels trying to extract conclusions from the first round’s results, I decided it would be best to improve the survey as much as possible and gather better data in round two. This article chronicles how I first built the survey, and how I then listened to user feedback to improve it.

Defining the research

Of the subjects within web performance, image optimization is especially vast. There’s a wide array of formats, encodings, and optimization tools, all of which are designed to make images small enough for web use while maintaining reasonable visual quality. Striking the balance between speed and quality is really what image optimization is all about.

This balance between performance and visual quality prompted me to consider how people perceive image quality, lossy image quality in particular. Eventually, this train of thought led to a series of questions that spurred the design and development of an image quality perception survey. The idea of the survey is to collect subjective assessments of quality by asking participants to rate images without an objective reference for what’s “perfect.” This is, after all, how people view images in situ.

A word on surveys

Any time we want to quantify user behavior, it’s inevitable that a survey is at least considered, if not ultimately chosen, to gather data from a group of people. After all, surveys seem perfect when your goal is to get something measurable. However, the survey is a seductively dangerous tool, as Erika Hall cautions. Surveys are easy to make and conduct, and are routinely abused in their dissemination. They’re not great tools for assessing past behavior, and they’re just as bad (if not worse) at predicting future behavior. For example, the 1–10 scale often employed by customer satisfaction surveys doesn’t really say much of anything about how satisfied customers actually are, or how likely they’ll be to buy a product in the future.

The unfortunate reality, however, is that short of lording over hundreds of participants in person, the survey is the only truly practical tool I have to measure how people perceive image quality, as well as whether (and potentially how) performance metrics correlate with those perceptions. When I designed the survey, I kept to the following guidelines:

- Don’t ask participants about anything other than what their perceptions are in the moment. By the time a participant has moved on, their recollection of what they just did rapidly diminishes as time elapses.
- Don’t assume participants know everything you do. Guide them with relevant copy that succinctly describes what you expect of them.
- Don’t ask participants to provide assessments with coarse inputs. Use an input type that permits them to finely assess image quality on a scale congruent with the lossy image quality encoding range.

All we can do going forward is acknowledge we’re interpreting the data we gather under the assumption that participants are being truthful and understand the task given to them. Even if the perception metrics are discarded from the data, there are still some objective performance metrics gathered that could tell a compelling story. From here, it’s a matter of defining the questions that will drive the research.

Asking the right questions

In research, you’re seeking answers to questions. In the case of this particular effort, I wanted answers to these questions:

- How accurate are people’s perceptions of lossy image quality in relation to actual quality?
- Do people perceive the quality of JPEG images differently than WebP images?
- Does performance play a role in all of this?

These are important questions. To me, however, answering the last question was the primary goal. But the road to answers was (and continues to be) a complex journey of design and development choices. Let’s start out by covering some of the tech used to gather information from survey participants.

Sniffing out device and browser characteristics

When measuring how people perceive image quality, devices must be considered. After all, any given device’s screen will be more or less capable than others. Thankfully, HTML features such as srcset and picture are highly appropriate for delivering the best image for any given screen. This is vital because one’s perception of image quality can be adversely affected if an image is ill-fit for a device’s screen. Conversely, performance can be negatively impacted if an exceedingly high-quality (and therefore behemoth) image is sent to a device with a small screen. When sniffing out potential relationships between performance and perceived quality, these are factors that deserve consideration.

With regard to browser characteristics and conditions, JavaScript gives us plenty of tools for identifying important aspects of a user’s device. For instance, the currentSrc property reveals which image is being shown from an array of responsive images. In the absence of currentSrc, I can somewhat safely assume support for srcset or picture is lacking, and fall back to the img tag’s src value:

const surveyImage = document.querySelector(".survey-image");
let loadedImage = surveyImage.currentSrc || surveyImage.src;

Where screen capability is concerned, devicePixelRatio tells us the pixel density of a given device’s screen. In the absence of devicePixelRatio, you may safely assume a fallback value of 1:

let dpr = window.devicePixelRatio || 1;

devicePixelRatio enjoys excellent browser support. Those few browsers that don’t support it (i.e., IE 10 and under) are highly unlikely to be used on high density displays.

The stalwart getBoundingClientRect method retrieves the rendered width of an img element, while the HTMLImageElement interface’s complete property indicates whether an image has finished loading. The latter of these two is important, because it may be preferable to discard individual results in situations where images haven’t loaded.
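Neither call is exotic. Here’s a minimal sketch of how both values might be captured, reusing the .survey-image element from the earlier snippet:

// Measure the rendered width of the image specimen
const specimen = document.querySelector(".survey-image");
const renderedWidth = specimen.getBoundingClientRect().width;

// complete is true only once the image has finished loading;
// results where it is false may be worth discarding during analysis
const finishedLoading = specimen.complete;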

In cases where JavaScript isn’t available, we can’t collect any of this data. When we collect ratings from users who have JavaScript turned off (or are otherwise unable to run JavaScript), I have to accept there will be gaps in the data. The basic information we’re still able to collect does provide some value.

Sniffing for WebP support

As you’ll recall, one of the initial questions asked was how users perceived the quality of WebP images. The HTTP Accept request header advertises WebP support in browsers like Chrome. In such cases, the Accept header might look something like this:

Accept: image/webp,image/apng,image/*,*/*;q=0.8

As you can see, the WebP content type of image/webp is one of the advertised content types in the header content. In server-side code, you can check Accept for the image/webp substring. Here’s how that might look in Express back-end code:

const supportsWebP = (req.get("Accept") || "").indexOf("image/webp") !== -1;

In this example, I’m recording the browser’s WebP support status to a JavaScript constant I can use later to modify image delivery. I could use the picture element with multiple sources and let the browser figure out which one to use based on the source element’s type attribute value, but checking the header has clear advantages. First, it’s less markup. Second, the survey shouldn’t always choose a WebP source simply because the browser is capable of using it. For any given survey specimen, the app should randomly decide between a WebP or JPEG image. Not all participants using Chrome should rate only WebP images, but rather a random smattering of both formats.
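As a sketch of how that server-side decision might look, reusing the supportsWebP flag from the snippet above (the fifty-fifty split and helper name are my own illustrative assumptions, not code from the survey):

// Decide which format to serve for a given specimen.
// Only browsers that advertise WebP support are ever served WebP,
// and even then only about half the time, so both formats get rated.
function chooseFormat(supportsWebP) {
  return supportsWebP && Math.random() < 0.5 ? "webp" : "jpeg";
}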

Recording performance API data

You’ll recall that one of the earlier questions I set out to answer was if performance impacts the perception of image quality. At this stage of the web platform’s development, there are several APIs that aid in the search for an answer:

- Navigation Timing API (Level 2): This API tracks performance metrics for page loads. More than that, it gives insight into specific page loading phases, such as redirect, request and response time, DOM processing, and more.
- Navigation Timing API (Level 1): Similar to Level 2, but with key differences. The timings exposed by Level 1 of the API lack the accuracy of those in Level 2, and Level 1 metrics are expressed in Unix time. In the survey, data is only collected from Level 1 of the API if Level 2 is unsupported. It’s far from ideal (and also technically obsolete), but it does help fill in small gaps.
- Resource Timing API: Similar to Navigation Timing, but Resource Timing gathers metrics on the various loading phases of page resources rather than the page itself. Of all the APIs used in the survey, Resource Timing is used most, as it helps gather metrics on the loading of the image specimen the user rates.
- Server Timing: In select browsers, this API is brought into the Navigation Timing Level 2 interface when a page request replies with a Server-Timing response header. This header is open-ended and can be populated with timings related to back-end processing phases. It was added to round two of the survey to quantify back-end processing time in general.
- Paint Timing API: Currently only in Chrome, this API reports two paint metrics: first paint and first contentful paint. Because a significant slice of users on the web use Chrome, we may be able to observe relationships between perceived image quality and paint metrics (see the sketch after this list).
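Of these, the Paint Timing API is the simplest to query. A minimal sketch, in browsers that support it:

if ("performance" in window && "getEntriesByType" in performance) {
  // Each entry is named "first-paint" or "first-contentful-paint",
  // with startTime expressed in milliseconds since navigation start
  performance.getEntriesByType("paint").forEach((paintEntry) => {
    console.log(paintEntry.name, paintEntry.startTime);
  });
}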

Using these APIs, we can record performance metrics for most participants. Here’s a simplified example of how the survey uses the Resource Timing API to gather performance metrics for the loaded image specimen:

// Get information about the loaded image
const surveyImageElement = document.querySelector(".survey-image");
const fullImageUrl = surveyImageElement.currentSrc || surveyImageElement.src;
const imageUrlParts = fullImageUrl.split("/");
const imageFilename = imageUrlParts[imageUrlParts.length - 1];

// Check for performance API methods
if ("performance" in window && "getEntriesByType" in performance) {
  // Get entries from the Resource Timing API
  let resources = performance.getEntriesByType("resource");

  // Ensure resources were returned
  if (typeof resources === "object" && resources.length > 0) {
    resources.forEach((resource) => {
      // Check if the resource is for the loaded image
      if (resource.name.indexOf(imageFilename) !== -1) {
        // Access resource timings for the image here
      }
    });
  }
}

If the Resource Timing API is available and the getEntriesByType method returns results, the entry matching the loaded image is an object with timings that looks something like this:

{
  connectEnd: 1156.5999999947962,
  connectStart: 1156.5999999947962,
  decodedBodySize: 11110,
  domainLookupEnd: 1156.5999999947962,
  domainLookupStart: 1156.5999999947962,
  duration: 638.1000000037602,
  encodedBodySize: 11110,
  entryType: "resource",
  fetchStart: 1156.5999999947962,
  initiatorType: "img",
  name: "https://imagesurvey.site/img-round-2/1-1024w-c2700e1f2c4f5e48f2f57d665b1323ae20806f62f39c1448490a76b1a662ce4a.webp",
  nextHopProtocol: "h2",
  redirectEnd: 0,
  redirectStart: 0,
  requestStart: 1171.6000000014901,
  responseEnd: 1794.6999999985565,
  responseStart: 1737.0999999984633,
  secureConnectionStart: 0,
  startTime: 1156.5999999947962,
  transferSize: 11227,
  workerStart: 0
}

I grab these metrics as participants rate images, and store them in a database. Down the road when I want to write queries and analyze the data I have, I can refer to the Processing Model for the Resource and Navigation Timing APIs. With SQL and data at my fingertips, I can measure the distinct phases outlined by the model and see if correlations exist.
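To make that concrete, here’s a rough sketch of the kinds of phase durations the Processing Model describes, derived from a single resource timing entry like the one above (the exact queries used in the survey aren’t shown here):

// Derive loading phases from a Resource Timing entry, per the Processing Model
function resourcePhases(entry) {
  return {
    dns: entry.domainLookupEnd - entry.domainLookupStart,
    connect: entry.connectEnd - entry.connectStart,
    ttfb: entry.responseStart - entry.requestStart, // time to first byte
    download: entry.responseEnd - entry.responseStart,
    total: entry.duration
  };
}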

Having discussed the technical underpinnings of how data can be collected from survey participants, let’s shift the focus to the survey’s design and user flows.

Designing the survey

Though surveys tend to have straightforward designs and user flows relative to other sites, we must remain cognizant of the user’s path and the impediments a user could face.

The entry point

When participants arrive at the home page, we want to be direct in our communication with them. The home page intro copy greets participants, gives them a succinct explanation of what to expect, and presents two navigation choices: start the survey or read the privacy policy.

If the user decides to take the survey, they’ll reach a page politely asking for their professional occupation and any eyesight conditions they wish to disclose. The fields for these questions can be left blank, as some may not be comfortable disclosing this kind of information. Beyond this point, the survey begins in earnest.

The survey primer

Before the user begins rating images, they’re redirected to a primer page. This page describes what’s expected of participants and explains how to rate images. While the survey is promoted on design and development outlets where readers regularly work with imagery on the web, a primer is still useful in getting everyone on the same page. The first paragraph of the page stresses that users are rating image quality, not image content. This is important. Absent any context, participants may indeed rate images for their content, which is not what we’re asking for. After this clarification, the concept of lossy image quality is demonstrated with a diagram.

Lastly, the function of the rating input is explained. This could likely be inferred by most, but the explanatory copy helps remove any remaining ambiguity. Assuming your user knows everything you do is not necessarily wise. What seems obvious to one is not always so to another.

The image specimen page

This page is the main event and is where participants assess the quality of images shown to them. It contains two areas of focus: the image specimen and the input used to rate the image’s quality.

Let’s talk a bit out of order and discuss the input first. I mulled over a few options when it came to which input type to use. I considered a select input with coarsely predefined choices, an input with a type of number, and other choices. What seemed to make the most sense to me, however, was a slider input with a type of range.

A slider input is more intuitive than a text input, or a select element populated with various choices. Because we’re asking for a subjective assessment about something with such a large range of interpretation, a slider allows participants more granularity in their assessments and lends further accuracy to the data collected.

Now let’s talk about the image specimen and how it’s selected by the back-end code. I decided early on in the survey’s development that I wanted images that weren’t prominent in existing stock photo collections. I also wanted uncompressed sources so I wouldn’t be presenting participants with recompressed image specimens. To achieve this, I procured images from a local photographer. The twenty-five images I settled on were minimally processed raw images from the photographer’s camera. The result was a cohesive set of images that felt visually related to each other.

To properly gauge perception across the entire spectrum of quality settings, I needed to generate each image from the aforementioned sources at ninety-six different quality settings ranging from 5 to 100. To account for the varying widths and pixel densities of screens in the wild, each image also needed to be generated at four different widths for each quality setting: 1536, 1280, 1024, and 768 pixels, to be exact. Just the job srcset was made for!

To top it all off, images also needed to be encoded in both JPEG and WebP formats. As a result, the survey draws randomly from 768 images per specimen across the entire quality range, while also delivering the best image for the participant’s screen. This means that across the twenty-five image specimens participants evaluate, the survey draws from a pool of 19,200 images total.
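The article doesn’t say which tool produced these variants, but as a sketch of the idea, a small Node script using the sharp image library (my assumption, as are the output paths) could loop over the quality settings, widths, and formats like this:

const sharp = require("sharp");

const widths = [1536, 1280, 1024, 768];
const qualities = Array.from({ length: 96 }, (_, i) => i + 5); // 5 through 100

async function generateVariants(sourcePath, name) {
  for (const width of widths) {
    for (const quality of qualities) {
      // One JPEG and one WebP per width/quality combination: 4 × 96 × 2 = 768 files
      await sharp(sourcePath).resize({ width }).jpeg({ quality })
        .toFile(`out/${name}-${width}w-q${quality}.jpg`);
      await sharp(sourcePath).resize({ width }).webp({ quality })
        .toFile(`out/${name}-${width}w-q${quality}.webp`);
    }
  }
}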

With the conception and design of the survey covered, let’s segue into how the survey was improved by implementing user feedback into the second round.

Listening to feedback

When I launched round one of the survey, feedback came flooding in from designers, developers, accessibility advocates, and even researchers. While my intentions were good, I inevitably missed some important aspects, which made it necessary to conduct a second round. Iteration and refinement are critical to improving the usefulness of a design, and this survey was no exception. When we improve designs with user feedback, we take a project from average to something more memorable. Getting to that point means taking feedback in stride and addressing distinct, actionable items. In the case of the survey, incorporating feedback not only yielded a better user experience, it improved the integrity of the data collected.

Building a better slider input

Though the first round of the survey was serviceable, I ran into issues with the slider input. In round one of the survey, that input was a range slider with labels beneath the track.

There were two recurring complaints regarding this specific implementation. The first was that participants felt they had to align their rating to one of the labels beneath the slider track. This was undesirable for the simple fact that the slider was chosen specifically to encourage participants to provide nuanced assessments.

The second complaint was that the submit button was disabled until the user interacted with the slider. This design choice was intended to prevent participants from simply clicking the submit button on every page without rating images. Unfortunately, this implementation was unintentionally hostile and needed improvement, because it prevented users from proceeding without a clear and obvious explanation as to why.

Fixing the problem with the labels meant redesigning the slider. I removed the labels altogether to eliminate the temptation to align answers to them. Additionally, I changed the slider’s background property to a gradient pattern, which further implied the granularity of the input.

The submit button issue was a matter of how users were prompted. In round one the submit button was visible, yet the disabled state wasn’t obvious enough to some. After consulting with a colleague, I found a solution for round two: instead of the submit button being initially visible, it’s hidden and replaced by guide copy.

Once the user interacts with the slider and rates the image, a change event attached to the input fires, which hides the guide copy and replaces it with the submit button.

This solution is less ambiguous, and it funnels participants down the desired path. If someone with JavaScript disabled visits, the guide copy is never shown, and the submit button is immediately usable. This isn’t ideal, but it doesn’t shut out participants without JavaScript.
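A minimal sketch of that interaction, assuming class names like .survey-rating, .guide-copy, and .submit-button (the survey’s actual markup isn’t shown here):

const ratingInput = document.querySelector(".survey-rating");
const guideCopy = document.querySelector(".guide-copy");
const submitButton = document.querySelector(".submit-button"); // assumed to start hidden

// Once the participant rates the image, swap the guide copy for the submit button
ratingInput.addEventListener("change", () => {
  guideCopy.hidden = true;
  submitButton.hidden = false;
}, { once: true });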

Addressing scrolling woes

The survey page works especially well in portrait orientation. Participants can see all (or most) of the image without needing to scroll. In browser windows or on mobile devices in landscape orientation, however, the survey image can be larger than the viewport.

Working with such limited vertical real estate is tricky, especially in this case where the slider needs to be fixed to the bottom of the screen (which addressed an earlier bit of user feedback from round one testing). After discussing the issue with colleagues, I decided that animated indicators in the corners of the page could signal to users that there’s more of the image to see.

When the user hits the bottom of the page, the scroll indicators disappear. Because animations may be jarring for certain users, a prefers-reduced-motion media query is used to turn off this (and all other) animations if the user has a stated preference for reduced motion. In the event JavaScript is disabled, the scrolling indicators are always hidden in portrait orientation where they’re less likely to be useful and always visible in landscape where they’re potentially needed the most.
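The survey applies that preference in its styles, but the same check is also available from JavaScript via matchMedia. A sketch of how the indicator animation could be gated (the element and class names are assumptions):

// Respect the user's reduced-motion preference before starting any animation
const reduceMotion = window.matchMedia("(prefers-reduced-motion: reduce)").matches;

if (!reduceMotion) {
  document.querySelectorAll(".scroll-indicator").forEach((indicator) => {
    indicator.classList.add("animated"); // hypothetical class that starts the animation
  });
}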

Avoiding overscaling of image specimens

One issue that was brought to my attention by a coworker was how the survey image seemed to expand boundlessly with the viewport. On mobile devices this isn’t such a problem, but on large screens and even modestly sized high-density displays, images can be scaled excessively. Because the responsive img tag’s srcset attribute specifies a maximum resolution image of 1536w, an image can begin to overscale at layout widths as “small” as 768 pixels on devices with a device pixel ratio of 2.

Some overscaling is inevitable and acceptable. However, when it’s excessive, compression artifacts in an image can become more pronounced. To address this, the survey image’s max-width is set to 1536px for standard displays as of round two. For devices with a device pixel ratio of 2 or higher, the survey image’s max-width is set to half that, at 768px.

This minor (yet important) fix ensures that images aren’t scaled beyond a reasonable maximum. With a reasonably sized image asset in the viewport, participants will assess images close to or at a given image asset’s natural dimensions, particularly on large screens.

User feedback is valuable. These and other UX feedback items I incorporated improved both the function of the survey and the integrity of the collected data. All it took was sitting down with users and listening to them.

Wrapping up

As round two of the survey gets under way, I’m hoping the data gathered reveals something exciting about the relationship between performance and how people perceive image quality. If you want to be a part of the effort, please take the survey. When round two concludes, keep an eye out here for a summary of the results!

Thank you to those who gave their valuable time and feedback to make this article as good as it could possibly be: Aaron Gustafson, Jeffrey Zeldman, Brandon Gregory, Rachel Andrew, Bruce Hyslop, Adrian Roselli, Meg Dickey-Kurdziolek, and Nick Tucker.

Additional thanks to those who helped improve the image quality survey: Mandy Tensen, Darleen Denno, Charlotte Dann, Tim Dunklee, and Thad Roe.


A DIY Web Accessibility Blueprint

13 Mar 2018 at 6:18am

The summer of 2017 marked a monumental victory for the millions of Americans living with a disability. On June 13th, a Southern District of Florida judge ruled that Winn-Dixie’s inaccessible website violated Title III of the Americans with Disabilities Act. This was the first website accessibility case to go to trial under the ADA, which was passed into law in 1990.

Despite spending more than $7 million to revamp its website in 2016, Winn-Dixie neglected to include design considerations for users with disabilities. Some of the features that were added include online prescription refills, digital coupons, rewards card integration, and a store locator function. However, it appears that inclusivity didn’t make the cut.

Because Winn-Dixie’s new website wasn’t developed to WCAG 2.0 standards, the new features it boasted were in effect only available to sighted, able-bodied users. When Florida resident Juan Carlos Gil, who is legally blind, visited the Winn-Dixie website to refill his prescriptions, he found it to be almost completely inaccessible using the same screen reader software he uses to access hundreds of other sites.

Juan stated in his original complaint that he “felt as if another door had been slammed in his face.” But Juan wasn’t alone. Intentionally or not, Winn-Dixie was denying an entire group of people access to their new website and, in turn, each of the time-saving features it had to offer.

What makes this case unique is that it marks the first time in history in which a public accommodations case went to trial, meaning the judge ruled the website to be a “place of public accommodation” under the ADA and therefore subject to ADA regulations. Since there are no specific ADA regulations regarding the internet, Judge Scola decided the adoption of the Web Content Accessibility Guidelines (WCAG) 2.0 Level AA to be appropriate. (Thanks to the hard work of the Web Accessibility Initiative (WAI) at the W3C, WCAG 2.0 has found widespread adoption throughout the globe, either as law or policy.)

Learning to have empathy

Anyone with a product subscription service (think diapers, razors, or pet food) knows the feeling of gratitude that accompanies the delivery of a much needed product that arrives just in the nick of time. Imagine how much more grateful you’d be for this service if you, for whatever reason, were unable to drive and lived hours from the nearest store. It’s a service that would greatly improve your life. But now imagine that the service gets overhauled and redesigned in such a way that it is only usable by people who own cars. You’d probably be pretty upset.

This subscription service example is hypothetical, yet in the United States, despite federal web accessibility requirements instituted to protect the rights of disabled Americans, this sort of discrimination happens frequently. In fact, anyone assuming the Winn-Dixie case was an isolated incident would be wrong. Web accessibility lawsuits are rising in number. The increase from 2015 to 2016 was 37%. While some of these may be what’s known as “drive-by lawsuits,” many of them represent plaintiffs like Juan Gil who simply want equal rights. Scott Dinin, Juan’s attorney, explained, “We’re not suing for damages. We’re only suing them to follow the laws that have been in this nation for twenty-seven years.”

For this reason and many others, now is the best time to take a proactive approach to web accessibility. In this article I’ll help you create a blueprint for getting your website up to snuff.

The accessibility blueprint

If you’ll be dealing with remediation, I won’t sugarcoat it: successfully meeting web accessibility standards is a big undertaking, one that is achieved only when every page of a site adheres to all the guidelines you are attempting to comply with. As I mentioned earlier, those guidelines are usually WCAG 2.0 Level AA, which means meeting every Level A and AA requirement. Tight deadlines, small budgets, and competing priorities may increase the stress that accompanies a web accessibility remediation project, but with a little planning and research, making a website accessible is both reasonable and achievable.

My intention is that you may use this article as a blueprint to guide you as you undertake a DIY accessibility remediation project. Before you begin, you’ll need to increase your accessibility know-how, familiarize yourself with the principles of universal design, and learn about the benefits of an accessible website. Then you may begin to evangelize the benefits of web accessibility to those you work with.

Have the conversation with leadership

Securing support from company leadership is imperative to the long-term success of your efforts. There are numerous ways to broach the subject of accessibility, but, sadly, in the world of business, substantiated claims top ethics and moral obligation. Therefore I’ve found one of the most effective ways to build a business case for web accessibility is to highlight the benefits.

Here are just a few to speak of:

Accessible websites are inherently more usable, and consequently they get more traffic. Additionally, better user experiences result in lower bounce rates, higher conversions, and less negative feedback, which in turn typically make accessible websites rank higher in search engines.

Like assistive technology, web crawlers (such as Googlebot) leverage HTML to get their information from websites, so a well marked-up, accessible website is easier to index, which makes it easier to find in search results.

There are a number of potential risks for not having an accessible website, one of which is accessibility lawsuits.

Small businesses in the US that improve the accessibility of their website may be eligible for a tax credit from the IRS.

Start the movement

If you can’t secure leadership backing right away, you can still form a grassroots accessibility movement within the company. Begin slowly and build momentum as you work to improve usability for all users. Though you may not have the authority to make company-wide changes, you can strategically and systematically lead the charge for web accessibility improvements.

My advice is to start small. For example, begin by pushing for site-wide improvements to color contrast ratios (which would help color-blind, low-vision, and aging users) or work on making the site keyboard accessible (which would help users with mobility impairments or broken touchpads, and people such as myself who prefer not to use a mouse whenever possible). Incorporate user research and A/B testing into these updates, and document the results. Use the results to champion further accessibility improvements.
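To put numbers behind that contrast push, WCAG 2.0 defines contrast ratio in terms of relative luminance. Here’s a minimal sketch of the calculation (colors given as [r, g, b] values from 0 to 255):

// Relative luminance of an sRGB color, per the WCAG 2.0 definition
function luminance([r, g, b]) {
  const [R, G, B] = [r, g, b].map((channel) => {
    const c = channel / 255;
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * R + 0.7152 * G + 0.0722 * B;
}

// Contrast ratio between two colors; WCAG 2.0 Level AA requires 4.5:1 for normal text
function contrastRatio(colorA, colorB) {
  const [lighter, darker] = [luminance(colorA), luminance(colorB)].sort((a, b) => b - a);
  return (lighter + 0.05) / (darker + 0.05);
}

console.log(contrastRatio([85, 85, 85], [255, 255, 255])); // dark gray on white, roughly 7.5:1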

Read and re-read the guidelines

Build your knowledge base as you go. Learning which laws, rules, or guidelines apply to you, and understanding them, is a prerequisite to writing an accessibility plan. Web accessibility guidelines vary throughout the world. There may be other guidelines that apply to you, and in some cases, additional rules, regulations, or mandates specific to your industry.

Not understanding which rules apply to you, not reading them in full, or not understanding what they mean can create huge problems down the road, including excessive rework once you learn you need to make changes.

Build a team

Before you can start remediating your website, you’ll need to assemble a team. The number of people will vary depending on the size of your organization and website. I previously worked for a very large company with a very large website, yet the accessibility team they assembled was small in comparison to the thousands of pages we were tasked to remediate. This team included a project manager, visual designers, user experience designers, front-end developers, content editors, a couple of requirements folks, and a few QA testers. Most of these people had been pulled from their full-time roles and instructed to quickly become familiar with WCAG 2.0. To help you create your own accessibility team, I will explain in detail some of the top responsibilities of the key players:

The project manager is responsible for coordinating the entire remediation process. They will help run planning sessions, keep everyone on schedule, and report the progress being made. Working closely with the requirements people, their goal is to keep every part of this new machine running smoothly.

Visual designers will mainly address issues of color usage and text alternatives. In its present form, WCAG 2.0 contrast minimums only apply to text; however, the much anticipated WCAG 2.1 update (due to be released in mid-2018) contains a new success criterion for Non-text Contrast, which covers contrast minimums for all interactive elements and “graphics required to understand the content.” Visual designers should also steer clear of design trends that ruin usability.

UX designers should be checking for consistent, logical navigation and reading order. They’ll need to test that pages are using heading tags appropriately (headings are for semantic structure, not for visual styling). They’ll be checking to see that page designs are structured to appear and operate in predictable ways.

Developers have the potential to make or break an accessible website, because even the best designs will fail if implemented incorrectly. If your developers are unfamiliar with WAI-ARIA, accessible coding practices, or accessible JavaScript, then they have a few things to learn. Developers should think of themselves as designers because they play a very important role in designing an inclusive user experience. Luckily, Google offers a short, free Introduction to Web Accessibility course and, via Udacity, a free, advanced two-week accessibility course. Additionally, The A11Y Project is a one-stop shop loaded with free pattern libraries, checklists, and accessibility resources for front-end developers.

Editorial reviews the copy for verbosity. Avoid using phrases that will confuse people who aren’t native language speakers. Don’t “beat around the bush” (see what I did there?). Keep content simple, concise, and easy to understand. No writing degree? No worries. There are apps that can help you improve the clarity of your writing and that correct your grammar like a middle school English teacher. Score bonus points by making sure link text is understandable out of context. While this is a WCAG 2.0 Level AAA guideline, it’s also easily fixed and it greatly improves the user experience for individuals with varying learning and cognitive abilities.

Analysts work in tandem with editorial, design, UX, and QA. They coordinate the work being done by these groups and document the changes needed. As they work with these teams, they manage the action items and follow up on any outstanding tasks, questions, or requests. The analysts also deliver the requirements specifications to the developers. If the changes are numerous and complex, the developers may need the analysts to provide further clarification and to help them properly implement the changes as described in the specs.

QA will need to be trained to the same degree as the other accessibility specialists, since they will be responsible for testing the changes being made and catching any issues that arise. They will need to learn how to navigate a website using only a keyboard and also by properly using a screen reader (ideally a variety of screen readers). I emphasized “properly” because while anyone can download NVDA or turn on VoiceOver, it takes another level of skill to understand the difference between “getting through a page” and “getting through a page with standard keyboard controls.” Having individuals with visual, auditory, or mobility impairments on the QA team can be a real advantage, as they are more familiar with assistive technology and can test in tandem with others. Additionally, there are a variety of automated accessibility testing tools you can use alongside manual testing. These tools typically catch only around 30% of common accessibility issues, so they do not replace ongoing human testing. But they can be extremely useful in helping QA learn when an update has negatively affected the accessibility of your website.

Start your engines!

Divide your task into pieces that make sense. You may wish to tackle all the global elements first, then work your way through the rest of the site, section by section. Keep in mind that every page must adhere to the accessibility standards you’re following for it to be deemed “accessible.” (This includes PDFs.)

Use what you’ve learned so far by way of accessibility videos, articles, and guidelines to perform an audit of your current site. While some manual testing may seem difficult at first, you’ll be happy to learn that some manual testing is very simple. Regardless of the testing being performed, keep in mind that it should always be done thoroughly and by considering a variety of users, including:

keyboard users; blind users; color-blind users; low-vision users; deaf and hard-of-hearing users; users with learning disabilities and cognitive limitations; mobility-impaired users; users with speech disabilities; and users with seizure disorders.

When you are in the weeds, document the patterns

As you get deep in the weeds of remediation, keep track of the patterns being used. Start a knowledge repository for elements and situations. Lock down the designs and colors, code each element to be accessible, and test these patterns across various platforms, browsers, screen readers, and devices. When you know the elements are bulletproof, save them in a pattern library that you can pull from later. Having a pattern library at your fingertips will improve consistency and compliance, and help you meet tight deadlines later on, especially when working in an agile environment. You’ll need to keep this online knowledge repository and pattern library up-to-date. It should be a living, breathing document.

Cross the finish line … and keep going!

Some people mistakenly believe accessibility is a set-it-and-forget-it solution. It isn’t. Accessibility is an ongoing challenge to continually improve the user experience the way any good UX practitioner does. This is why it’s crucial to get leadership on board. Once your site is fully accessible, you must begin working on the backlog of continuous improvements. If you aren’t vigilant about accessibility, people making even small site updates can unknowingly strip the site of the accessibility features you worked so hard to put in place. You’d be surprised how quickly it can happen, so educate everyone you work with about the importance of accessibility. When everyone working on your site understands and evangelizes accessibility, your chances of protecting the accessibility of the site are much higher.

It’s about the experience, not the law

In December of 2017, Winn-Dixie appealed the case brought by blind patron Juan Carlos Gil. Their argument is that a website does not constitute a place of public accommodation, and that the case should therefore have been dismissed. This case, and others, illustrate that the legality of web accessibility is still very much in flux. However, as web developers and designers, our motivation to build accessible websites should have nothing to do with the law and everything to do with the user experience.

Good accessibility is good UX. We should seek to create the best user experience for all. And we shouldn’t settle for simply meeting accessibility standards but rather strive to create an experience that delights users of all abilities.

Additional resources and articles

If you are ready to learn more about web accessibility standards and become the accessibility evangelist on your team, here are some additional resources that can help.

Resources

- Interactive WCAG 2.0: an awesome full version of the WCAG 2.0 guidelines that allows you to filter success criteria by responsibility.
- tota11y: an easy-to-use accessibility visualization tool from Khan Academy.
- The A11Y Project: a ton of libraries, checklists, and accessibility resources for front-end developers.
- Web Accessibility by Google: Developing with Empathy (a free two-week eLearning course geared toward experienced front-end developers).
- “Top Twenty-Five Awesome Accessibility Testing Tools for Websites”: a compiled list of twenty-five automated accessibility testing tools with a brief description of each one.

Articles

- “Why Designing for Accessibility Is Simply Good Business”: lists seven business-savvy benefits of having an accessible website.
- “Accessibility Is Part of UX (It Isn’t a Swear Word)”: an awesome article that addresses how the separation of HTML and CSS affects navigation, layout, and more.
- “Reframing Accessibility for the Web”: addresses negative stereotypes, ableism, and how to integrate accessibility into your testing process.
- “What Does Responsive Web Design Have to Do with Accessibility?”: discusses how responsive web design improves UX and accessibility.
- “Ten Guidelines to Improve the Usability and Accessibility of Your Site”: helps you identify the “low-hanging fruit” of accessibility issues and shows you how to fix them.
