HighEdWeb Day 2: Electric Boogaloo

Good time last night; you totally missed out. I know there are possibly some incriminating pictures of me floating around in a wig with my Superman patch out for all to see. I got bested by the Guru himself at the Guitar Hero contest. Next year… I’m coming back Rocky style. For those of you following along from here, my planned course is to hit UAD1, SAC2, APS3, TPR4, TPR5, TPR6, APS7, APS8, UAD9, UAD10.

For those not here, UAD1 is about how higher education is responding to accessibility needs. Answer for us: we’re not. Yet. I’m more concerned with making sure crap works at all, heh. Then 508 becomes my bedroom buddy. Terrill Thompson of the University of Washington is presenting. Some fun stats: 600 million people in the world with disabilities, 52.5 million in the US, and 1 million college students, plus 3,025 complaints of disability-related discrimination filed with the Department of Education. He didn’t make clear whether those stats relate directly to disabilities impeding access to web sites, or whether the complaints were web related.

Update 8:47AM ~ Going over the basic issues of accessibility: no alt tags, fly-out menus, page scaling, closed captioning, CAPTCHAs, etc. A note about CAPTCHAs: has anyone else noticed audio CAPTCHAs seem to be getting a lot harder lately too? Seems that way to me.

Measuring outcomes - target higher ed pages and measure policies, procedures, and promising practices. Too many P’s. Choose a measure of accessibility from among the WCAG’s 14 guidelines and their checkpoints at three priority levels. Section 508, the federal law, is based on WCAG Priority 1.

Update 8:58AM ~ Resources for testing accessibility: Bobby (Watchfire), CynthiaSays (HiSoftware), Functional Accessibility Evaluator (FAE). FAE has 31 rules in 5 categories. Best success seems to be in HTML standards and scripting. Styling and text equivalents are also close, at around 50%. Navigation and orientation is craptacular (36.07%). We’re taking a look at the HighEdWeb home page to see how accessible it is. It is very nice. Good job, guys.

Terrill did a study of 7,239 higher ed home pages in 162 countries, as well as 5,281 governmental pages in 181 countries. They also did a manual assessment of 127 home pages in the Northwest US over time; this ran six months and included varying levels of training. The main improvements were in alt tags, forms, and skip navigation links. Three areas got worse: keyboard accessibility, noscript content, and CSS disabling. Those last two surprise me some. The change across those six areas was more pronounced among the people who were trained, so the easy stuff got easier, but the hard stuff got harder.

Update 9:15AM ~ A note on the athenpro.org survey on Accessible Technology in Higher Education. Nice spread of locations and college levels, though with a high concentration in the West as far as US feedback goes. Roughly half have web accessibility policies in place. About 70% of US respondents have a person or office responsible for accessibility. That in particular seems high, especially based on the sampling in the room currently. As for considering accessibility when acquiring IT, 44% in the US have a policy. Was accessibility a concern when purchasing an LMS? 44% in the US said yes.

So far, lots of stats, but little in the way of solutions or recommendations. My take is that sometimes you have to get features out before getting them polished, otherwise you’d never get anything new on a campus web site, and if the choice is between an inaccessible product and no product at all, sometimes you take what you can get. Being accessible shouldn’t mean short-changing your whole community on tools. I’m not sure where the happy medium lies, and maybe that’s the point: it’s different everywhere.

UW added a 0.5 FTE for accessibility and set up a couple of accessibility web sites, one public and one internal for resource building. They also held their own mini conference for campus stakeholders, which even included a couple of vendors, to discuss how to tackle things.

Update 9:57AM ~ One free Pepsi later, I am preparing for SAC2, on producing a style guide for a university. This is something I’m hoping to tackle before we get our new president sometime next year. Plus it will really help with twisting people’s arms to do things right. I just realized that the ton of short sessions today and tomorrow will result in fairly long blog posts. I apologize. This session is presented by a guy named Jesse; I cannot read his last name from here. We’re defining value, process, and content: the why, how, and what of a style guide.

The main goal is consistency: improve the professional image and legitimacy of content, and create predictability. The main reason people started developing style guides was that everyone was doing their own thing, and a guide gets everyone on the same page. This saves time and money, speeds up workflow by creating a baseline (including when you outsource work to contractors), and manages expectations.

Style guides are good reinforcement when you need to tell people “no.” They show decisions aren’t arbitrary, because they’re “in the style guide.” Swinburne University and San Diego State are good examples of style guides. Start with some info on how to do a corporate style guide, and adapt it to your needs.

Update 10:14AM ~ Step 1: Managerial buy in.
Step 2: Form a committee (if you need to).
Step 3: Determine the audience (the end user of the document).
Step 4: Create the content. Pick something like Chicago or AP style to base it on. Choose the right platform. Work from your needs.

Lots of emphasis in guides on colors and palettes, fonts, and accessibility, as well as logos and templates.

Update 10:23AM ~ Step 5: Distribute it and get feedback.  I am packing up and prepping for the next session now, APS3.  See you there.

Update 10:57AM ~ Web Services for Web Services is presented by Kevin Bischof and Kat Hollowell of Xavier. Their team is in IT, which they find beneficial, which is interesting because I found moving our team out of IT to be beneficial. Just goes to show you that different places work in different ways. They prioritize based on A, B, and C levels. So far, a lot of overview on just how Xavier works (CMS, analytics, site features), mostly basic info. They do, however, have a style guide, as a nod to the last session. They host a “web week” where they run training sessions all week (19 sessions) on anything from basic HTML to photo editing. Neat idea, similar to the campus “conference” mentioned earlier. They also “certify” webmasters, an idea I’ve toyed with in the past.

Update 11:23AM ~ Explaining the usage of a smarter 404 page. You should explain why a user landed there, provide index links and search, and perhaps offer a way to report the 404. Xavier uses a “smart” 404 to help find where a user needs to go, first with redirects and then a script that looks for where they might have been going. Their 404 submission process is automated: an AJAX button reports the error anonymously and submits it to a 404 database. Then they rely a lot on students for the cleanup. Since January of 2008, they’ve gotten error reports down from 500 a month to 40 or 50.
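
To make that idea concrete, here’s a rough sketch of what an anonymous AJAX “report this broken link” button could look like. This is my guess at the pattern, not Xavier’s code; the /404-report endpoint and the #report-404 element are made up.

```javascript
// Rough sketch of an anonymous "report this 404" button on the error page.
// The /404-report endpoint and #report-404 ID are placeholders.
$(document).ready(function () {
    $('#report-404').click(function () {
        $.post('/404-report', {
            badUrl:   window.location.href,  // the URL that 404ed
            referrer: document.referrer      // where the visitor came from
        }, function () {
            // swap the button for a thank-you note once the report is logged
            $('#report-404').replaceWith('<p>Thanks! We\'ll get the link fixed.</p>');
        });
        return false; // keep the button/link from doing anything else
    });
});
```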

Other ideas for what people are doing: tag clouds on search, popular searches, and a nightly link check. Overall, pretty light on the info; I was hoping for more on how they handle development requests from departments and manage project workflow. I asked, and they’re using SharePoint for project tracking.

Okay, lunchtime.  Be back later this afternoon.

Update 12:55PM ~ Settling in for the keynote in the theatre.  Sitting pretty much center of house, so my fancy extension cord fails me.  The Twitter backchannel is up on the huge projector.  I never looked so good as a huge avatar.  Following the keynote, watch for me in TPR4, the javascript:void() session.

The keynote is Jeffrey Veen, a smart web guy. The topic is along the lines of tiny actions and behaviors and how they add up to collective, emerging patterns. We start in 1974, around the cultural end of the “60s”. The start of the environmental movement. Nixon and trusted advisor Elvis. KISS’s first album. The AT&T breakup begins. Pong, the idea of controlling what is on the screen. IBM releases the Winchester hard drive (30ms seek time and 30MB storage), at roughly $100,000/GB. Google gives it out for about $0.15/GB per month. Tools for participation + scale of data.

Update 1:22PM ~ Nice example of how the UI can impact the usability of a dataset, going from a bunch of numbers to a table with labels, typography, and color coding. Mind the line between communication and decoration. Aside: the best presenters really know how to put a slideshow together, and Veen is a good case in point. He goes back to John Snow’s cholera research of 1854 as an example of data visualization demonstrating correlated facts, and also to Charles Joseph Minard in the 1860s. The idea is to not make users think. Let the user find the story independent of the stats. A more modern example is the London Underground tube map: not at all representative of the actual paths the trains take, but it conveys the important information in a meaningful manner.

The trend charts in Google Analytics’ data visualization were inspired by Raiders of the Lost Ark. Assign different visual cues to each dimension of your data, then remove everything that doesn’t tell the story of the data. A print item is an artifact you have control over. The web is a set of recommendations for how a page *should* look, but it requires giving up control. Example: CSS Zen Garden. Also consider RSS aggregators. Give users the tools to interface with content and data the way they want, and let them find their own story in the data.

Update 1:44PM ~ Don’t try to determine what is important to people; let them decide. Create filters to add clarity for them. Not only do you not necessarily know the story in a dataset ahead of time, you can’t necessarily predict how long it will run, or what new value will come from it later. Storytelling leads to discovery, visual cues lead to interactivity, and editing leads to filtering. “Math is easy. Design is hard.” Is it possible to do too much research, to the point that it gets in the way of innovation? Know yourself, then understand your users.

Overall, an excellent keynote, very fun to listen to and very well assembled. In closing, he recommends Steven Johnson’s The Ghost Map, Edward Tufte (several books), and Ben Fry’s Visualizing Data.

Update 2:13PM ~ Into TPR4 after indulging in a refreshing free Pepsi (#2), and swiping another one to take to the hotel room (#3). I know, I’m sneaky like a ninja. This session on avoiding javascript:void() is of particular interest to me, because our calendar is very JavaScript heavy, and improving its accessibility is a high priority for me. This is presented by Jason Pitoniak of RIT.

Accessible content doesn’t just apply to people with disabilities. Think about mobile devices, smart devices, smaller screens, missing capabilities, users with less experience, etc. “Code like it’s 1999,” then use progressive enhancement on the client side. Level the playing field with JS libraries. Ensure that basic functionality exists, even if it isn’t great looking. For instance, on a form, enable all fields by default and filter on the server side, but use client-side code to selectively disable fields. Libraries also eliminate browser irregularities from the get-go and add support for “missing” events. Generally, JS libraries just make things easier.
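
A minimal sketch of that form pattern as I understood it: every field ships enabled in the HTML and the server validates regardless, and the script only refines the experience when it’s available. The #reason and #other-detail fields here are hypothetical, not anything from the session.

```javascript
// Progressive enhancement sketch: all fields are enabled in the markup and the
// server filters input regardless; this script just tidies the experience.
// #reason and #other-detail are hypothetical form fields.
$(document).ready(function () {
    function toggleDetail() {
        if ($('#reason').val() === 'other') {
            $('#other-detail').removeAttr('disabled');       // relevant: allow input
        } else {
            $('#other-detail').attr('disabled', 'disabled');  // irrelevant: grey it out
        }
    }
    toggleDetail();                     // set the initial state on load
    $('#reason').change(toggleDetail);  // keep it in sync as the user picks
});
```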

WAI-ARIA is trying to solve the two biggest accessibility issues: runtime changes that can’t be seen by assistive technologies, and users who don’t understand the intended meaning of objects on the page. This is done with role attributes that tell the browser the intended meaning of the markup, which it can then pass along to the assistive technology. States then define the actual current status of an element (checked, unchecked, etc). I’m wondering how all these states and roles work with properly validated XHTML, since they add extra markup to tags. I’ll try to sneak in the question in a bit. ARIA live regions are spaces on a page identified as likely to change while not currently focused. Keep in mind, ARIA is currently only a draft. FF3 does support it, though, and he just answered my other question: it’s not supported in (X)HTML.
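
For what it’s worth, one common way to square ARIA with valid XHTML is to bolt the role and state attributes on with script at runtime, so the static markup still validates. Here’s a rough sketch of what that looks like; the element IDs are mine, not the presenter’s.

```javascript
// Sketch: add ARIA roles and states at runtime so the static XHTML still
// validates. #fancy-checkbox and #status-message are hypothetical elements.
$(document).ready(function () {
    // A styled <div> standing in for a checkbox: give it a role and a state.
    $('#fancy-checkbox')
        .attr('role', 'checkbox')
        .attr('aria-checked', 'false')
        .click(function () {
            var checked = $(this).attr('aria-checked') === 'true';
            $(this).attr('aria-checked', checked ? 'false' : 'true'); // flip the state
        });

    // A live region: assistive technology announces changes here even when
    // the user's focus is somewhere else on the page.
    $('#status-message').attr('aria-live', 'polite');
});
```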

This is really much more of an accessibility overview with respect to ARIA than anything else. Not technical or propeller-hatty enough for me. I thought we were actually going to go over examples of how to get around javascript: links. Oh well.

Update 3:36PM ~ Coming up now, the next TPR session (TPR5): getting schooled on some jQuery. Jaclyn Whitehorn is presenting. jQuery is an open source JavaScript library (as if you don’t know that). It gives you access to all the parts of the DOM of your pages.

Some assorted uses: applying CSS, changing HTML, controlling behaviors, and implementing AJAX. Good knowledge of CSS helps, as manipulating the DOM relies on it. Nice, simple examples of doing a hover effect on tables. Also taking a look at some plugins, such as Accordion. Keep in mind that this stuff controls behavior only, not presentation, which you still need to handle in CSS.
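
The table hover example was along these lines (my reconstruction; the table.data selector and highlight class are made up): the script only toggles a class, and the CSS decides what “highlighted” actually looks like.

```javascript
// Highlight table rows on hover. jQuery only toggles the class;
// what .highlight looks like is defined in the stylesheet.
// The table.data selector is a placeholder.
$(document).ready(function () {
    $('table.data tr').hover(
        function () { $(this).addClass('highlight'); },    // mouse in
        function () { $(this).removeClass('highlight'); }  // mouse out
    );
});
```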

Update 3:53PM ~ Nice look at Lightbox. I’m a big fan of this myself; it’s just much nicer looking than showing someone a picture on a white page. jQuery also has a “roll your own AJAX” option, and plugins help make using AJAX much easier. The UI plugin lets you download just the components you need, to avoid needless JS bloat from features you won’t use.
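
The “roll your own AJAX” option boils down to $.ajax. Something like this hypothetical fragment loader (the /news/latest URL and #news-panel target are made up, not from the session):

```javascript
// "Roll your own" AJAX: fetch an HTML fragment and drop it into the page.
// /news/latest and #news-panel are placeholders.
$.ajax({
    url: '/news/latest',
    type: 'GET',
    dataType: 'html',
    success: function (html) {
        $('#news-panel').html(html);  // inject the fragment
    },
    error: function () {
        $('#news-panel').text('Could not load the latest news.');
    }
});
```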

Update 4:56PM ~ TPR6 is HTTP 201, all about user agents. It’s also Jason Woodward’s session. Rock on. Sadly, I didn’t go to HighEdWeb last year, so I missed HTTP 101. My loss. Anyway, first up, Wireshark is a nice little program for capturing HTTP header information. It’s available for Windows or OS X.

HTTP is RESTful, which stands for “Representational State Transfer.” Actions are performed on resources, and each action is performed independently of other actions, without regard to any intermediaries between the application and the resource. This goes for anything, not just HTTP. Resources are what you identify with a URI. Actions are things like GET, POST, PUT, etc. Response codes indicate success, failure, and so on.
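
In code terms, that boils down to something like this sketch of mine (the /api/events/42 URI is made up): one resource, a verb saying what to do with it, and a status code saying how it went.

```javascript
// One resource, acted on with a verb; the status code on the response
// says how the action went. /api/events/42 is a placeholder URI.
var xhr = new XMLHttpRequest();
xhr.open('PUT', '/api/events/42', true);
xhr.setRequestHeader('Content-Type', 'application/json');
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4) {
        if (xhr.status === 200 || xhr.status === 204) {
            // success: the resource was updated
        } else if (xhr.status === 405) {
            // this resource doesn't allow PUT at all
        }
    }
};
xhr.send('{"title": "Homecoming"}');
```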

This idea of hitting HTTP for JSON-type information is almost exactly what I want to do to get data out of our home-grown portal into pages on the CMS, except I’m doing it with XML. But the principle behind the execution is pretty much the same. We’re getting plenty of examples of interfacing with a web application over HTTP to process GETs and PUTs. Almost all of this presentation was code examples; very nice for a change. I was not aware that if you’re writing an application you have to manually code in responses like 405 and 304.
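
To illustrate that last point, here’s a minimal server-side sketch of my own (in Node.js, not anything shown in the session) where the application itself has to answer with 405 and decide about 304; the /widgets resource is made up.

```javascript
// Minimal sketch: the application itself decides when to answer 405 or 304.
// Not the presenter's code; the /widgets resource is a placeholder.
var http = require('http');

http.createServer(function (req, res) {
    if (req.url !== '/widgets') {
        res.writeHead(404, { 'Content-Type': 'text/plain' });
        res.end('Not Found');
    } else if (req.method === 'GET') {
        // A real app would compare If-None-Match / If-Modified-Since here and
        // send a bare 304 when the client's cached copy is still current.
        res.writeHead(200, { 'Content-Type': 'application/json' });
        res.end('{"widgets": []}');
    } else {
        // Any verb other than GET on this resource: say so explicitly.
        res.writeHead(405, { 'Allow': 'GET' });
        res.end();
    }
}).listen(8080);
```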

Dinner time. To those joining for sushi, see you there; everyone else, catch you tomorrow!
