
Tag: technology

From stone tablet to a bite of the Apple

If you are among those who follow me on Twitter, you are likely already well acquainted with my recent changeover from one mobile phone platform to another. My intention to do this, as well as my reluctance, has long been a topic of discussion among my friends and fellow geeks. I’ve taken quite a bit of good-natured ribbing from people who, for nearly two years, have wondered how on Earth I can claim any expertise in accessibility, when clearly I am using technology from the Jurassic period. What follows is a short exposition on my long-overdue transition from the Windows Smartphone-based Motorola Q to the Apple iPhone 5.

The Moto Q, which my friends have dubbed The Stone Tablet, has been my only mobile device since 2007. To the dubiously named “Smartphone” operating system, I added Mobile Speak, a text-to-speech program by Code Factory. One feature I really liked about the Moto Q was the tactile QWERTY keyboard, which made text entry easy. It seemed that most of the new devices were making use of touch screen technology. How could text entry be easy with a touch screen? I wondered. It’s not that I was unaware of the tidal wave of Apple products sweeping over the globe; it’s that I didn’t care. One could hardly avoid the constant din of Apple zealots, though, especially those for whom accessibility is a priority. But my setup served its purpose, it worked for me, and I had no real desire to give it up…that is, until the phone began to suffer from the ravages of old age and, yes, obsolescence.

For a variety of reasons, one of which was the necessity of accepting credit card payments when exhibiting my Elegant Insights Braille Creations jewelry at conferences and trade shows, I decided to at least entertain the possibility of switching to an Apple device, although I had no idea which one. My first foray into an Apple Store was over a year ago at holiday time, when I stopped in to buy a loved one a gift card. While there, I decided to ask the Apple associate to show me an iPad, which seemed like the best option for me at the time, and maybe get a demonstration of VoiceOver, the screen reader built into Apple devices that makes using a touch screen possible for users who are blind.

Upon explaining my request to the associate, I was greeted by an awkward silence, and, according to my companion, a blank stare. “I don’t know what that voice thing is,” the young employee said, “I don’t think an iPad does that.”

“All of your products have VoiceOver,” I declared, as confidently as I could, not entirely sure if that was true. “It’s built into the iPad, and if I knew how to bring it up, I’d show you.” Okay, now that was a bald-faced lie; I had never so much as held an iPad or iPhone in my hands, and I just really wanted to see one. But he never so much as let me touch one, since he began to back away, realizing that he would be unable to assist me, and that the store was packed with people whom he could assist. I left the store empty-handed, except for the aforementioned gift card.

My interest was more recently piqued, though, when a friend showed me a variety of tablet sizes and models at a recent conference. I marveled at the full-size tablet, which seemed to be nothing more than a wafer-thin sheet of glass, reminiscent of a tray on which I’d served cheese at a dinner party.

After polling some tweeps and conducting a bit of my own research, I decided that in fact the device that would be best for me was the iPhone. While I had really enjoyed paying only $40 a month for my ancient cell service plan, I realized that having the phone combined with the iPad features would solve most of my problems and meet most of my needs. So, for my birthday, I decided to buy myself the gift of an iPhone 5.

Before it arrived in the mail, I gathered as many articles, podcasts, and user guides as I could get my hands on, and began to prepare for what I was sure would be a steep learning curve. Between the new operating system, the touch screen gestures, and a new speech interface to learn, the entire Apple iOS lexicon loomed large and intimidating before me.

Cutting to the chase, it took only a few days, once I got up and running, to master the device. Now, I can confidently claim fluency. However, it is the part of the process that came before I got up and running that I want to make note of here, simply as a way to help others who may be considering a similar switch. There are a few things you ought to know, and these things can make the difference between delight and utter frustration when it’s time to pull the device out of the packaging.

The first thing you ought to know is that people who know nothing about Apple devices really do know absolutely nothing. There is little about the Apple user experience that compares to devices made by other manufacturers, so do not, under any circumstances, listen to anyone who does not actually use an Apple product. This includes, but is not limited to, cellular service providers.

Just to give you one example of what I mean by this, realize that there is a difference between activating the new cellular phone service plan and activating the device. You may think this point to be obvious, but one hapless Sprint customer service associate who was unlucky enough to answer my call did not. Further, in response to my question about where I might find the serial number that is required to complete the setup process, I was told that it is located inside the phone. I was told to remove the back panel of the battery compartment and enter into the phone the numbers printed on the decal.

In case you don’t know, you cannot remove the back of the iPhone. There is no battery compartment from which to remove a back panel. The serial number is either printed somewhere on the packaging, or it is on file with the cellular service provider from which you ordered the phone.

You should also know that it is possible to set up the device yourself, right out of the box, without sighted assistance. However, if you are a person who is easily frustrated, know that there is an easy way to accomplish this, and a hard way. I was determined to get my phone working on my own, but if you know you have a short fuse, just do it the easy way: take the device to an Apple Store or to a store run by your cellular service provider, and have them set it up for you. At the time, I had no access to a nearby store, so unless I wanted to wait for someone who was available and willing to drive me some distance, I had few options, and I was impatient to get going. Ultimately, though, doing it my way may have actually taken longer than waiting for four wheels and a couple of eyeballs.

Setting up the phone requires quite a bit of data entry, and if you are unfamiliar with how text entry is achieved on an Apple device, it also requires quite a bit of patience. Text entry was a matter of some concern to me, but as it turned out, I caught on quickly and was able to enter the required information easily enough. What I found frustrating was that I wasn’t always entirely sure I understood what the phone was asking me to do. To put it in terms of the English language, the Apple dialect is a bit unfamiliar: word choice, usage, and syntax are different from what I had been accustomed to when using the “stone tablet.”

If you have not yet decided to change your outdated technology to an Apple device, are reluctant, or maybe just reject all things Apple out of hand, one reason you may feel this way could be concern about privacy. If you are among those still clinging fast to the illusion of privacy, I’m sympathetic. You should know that the moment you complete the setup process of the new Apple device, you have slipped from the edge and are now freefalling into the Apple abyss. You should carefully and thoroughly read the terms and conditions of use, as well as Apple’s privacy policy and that of the “artificial intelligence” assistant, Siri. Furthermore, you should scrutinize the TOS and privacy policies of any apps you download, whether free or paid. Frankly, I had to delete a number of apps, simply because their privacy policy, a misnomer if I ever heard one, made my skin crawl. If you have not already done so, and you are a blind user who has downloaded some of those object identification apps, you should take the time to learn what happens to the images of the items you photograph. It’s a little disturbing. If you are taking pictures of documents and mail for text recognition, place, or object identification purposes, don’t think for a minute that you are the only one privy to the contents of that photo. The same goes for your use of the voice dictation features. There’s more, but I’ll let you make that horrifying discovery on your own.

I’ll say this for my new iPhone: Since it arrived, it has seldom left my side. I have never been one to keep my cell phone strapped to my person, and I have never enjoyed using a cell phone. I dislike talking on one, I don’t like the way it makes voices sound, it’s harder to hear, it gets hot in your hand, and other than the few times it has been extremely convenient to have one, I find the overall experience of using a cell phone to be mostly dissatisfying. Since I’ve loaded up my iPhone 5, however, I’ve come to think of it as simply a hand-held computer that happens to sport a phone. I can easily see a day when I will, as eagerly as everyone else, anticipate the latest release of iOS, the newest app to drop, or the sleekest, lightest, most feature-rich iteration of the device itself. So…what’s next?


Help build an inclusive Twittersphere with Easy Chirp 2

For those of you who follow these things, you already know that Twitter (www.twitter.com), the social media micro-blogging platform, is making changes to its Application Programming Interface (API). For those of you who have no idea what that means, or why it’s significant, allow me to get you up to speed.

According to Wikipedia, an application programming interface (API) is a “protocol intended to be used as an interface by software components to communicate with each other. An API is a library that may include specification for routines, data structures, object classes, and variables.” If you want to read more, go here:

en.wikipedia.org/wiki/Application_programming_interface
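To make that abstraction a little more concrete, here is a rough sketch, in Python, of what “talking to the Twitter API” looks like from a third-party application’s point of view. The endpoint comes from Twitter’s v1.1 REST documentation; the four OAuth credentials are placeholders you would receive after registering an application at dev.twitter.com, and the requests and requests_oauthlib libraries are simply my choice for the example, not anything Twitter requires.

```python
# A rough sketch of a third-party app calling the Twitter REST API (v1.1).
# The OAuth credentials below are placeholders; real values come from
# registering an application at dev.twitter.com.
import requests
from requests_oauthlib import OAuth1

auth = OAuth1("CONSUMER_KEY", "CONSUMER_SECRET",
              "ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")

# Ask Twitter for the five most recent tweets in the authenticated
# account's home timeline.
response = requests.get(
    "https://api.twitter.com/1.1/statuses/home_timeline.json",
    params={"count": 5},
    auth=auth,
)

for tweet in response.json():
    print(tweet["user"]["screen_name"], "-", tweet["text"])
```

Every Twitter client you use, accessible or not, is ultimately making calls along these lines on your behalf, which is why a change to the API ripples out to every third-party application.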

Twitter has only had a single version of the API in its entire history. Now, they want to make changes, and update to version 1.1. They have announced new developer “rules of the road,” and have outlined the proposed changes here:

dev.twitter.com/blog/changes-coming-to-twitter-api

The changes will affect all third-party applications that interact with Twitter, such as those you might use as an accessible alternative to the main Twitter web site. Some of these third-party Twitter clients have already completed the necessary adjustments, while others may not even bother, and may simply disappear. Time is running short, however, because Twitter has announced the “sunset” of version 1.0 of the API here:

dev.twitter.com/blog/api-v1-retirement-final-dates

Ever since I first discovered Twitter, I’ve been using the accessible alternative created by Dennis Lembree. Originally called Accessible Twitter, the web-based version now goes by the name Easy Chirp. Due to the changes made by Twitter to the API, Dennis has been forced to reinvent Easy Chirp, soon to be Easy Chirp 2. Dennis needs your help. He has started a Kickstarter project, and he needs your pledges. The money raised will be used to compensate the experts Dennis has hired to assist with the project. As with any Kickstarter project, when you make a contribution you will receive a thank-you gift commensurate with the amount of your donation. See more info here:

Help build an inclusive Twittersphere: tinyurl.com/c9fsj5v

“I created Easy Chirp over four years ago and am touched by the support it’s received from the community. Now it must be rebuilt due to the Twitter API change, and I hope to collaborate this time with a few other developers,” Lembree says.

Dennis plans some new features and additional streamlining to make Easy Chirp 2 even faster and more accessible. It will continue to support keyboard-only users, will work without JavaScript, and will be better optimized for mobile devices. Of course, it will still feature the user-friendly interface you’ve come to expect, usable by people who have a variety of disabilities and who use a variety of assistive technologies.

Says Lembree: “To me, Easy Chirp exemplifies what a web app should be: platform agnostic, accessible, and simple. It provides a unique and necessary service in the social media space.”

There is no shortage of Twitter clients on the market, usable with different operating systems and device types. I use Easy Chirp for my own reasons, not the least of which is that I know Dennis, like him, trust him, and appreciate his work. If you have used Easy Chirp in the past, but have never clicked on that “donate” button just below the sign-in link on the Easy Chirp home page, then scrape a few coins out from between the sofa cushions and send them Dennis’s way. We’ll be tweeting at one another again before it’s time to fly south for the winter.

Pledge to the Easy Chirp 2 Kickstarter here:

Visit www.kickstarter.com and search for Easy Chirp, or go directly to the Easy Chirp 2 project page here: tinyurl.com/c9fsj5v

For all things Twitter API, go here:

dev.twitter.com/docs/api

You can follow Easy Chirp (@EasyChirp) for updates, or you can follow me (@Accessible_Info) on Twitter as well.


CSUN13: A thank-you note to thousands

Upon arriving home from my short trip to attend the 28th annual International Assistive Technology and Persons with Disabilities Conference, I discovered that I was struggling with an odd mix of sensations. Fatigue, from the endless walking through an enormous hotel property, late nights, and early mornings. Euphoria, from having met what seemed to be a nearly endless parade of people, all of whom, inexplicably, seemed delighted to see me. Excitement, from learning new things, finding fresh inspiration, and meeting new people. Dehydration, from my refusal to pay $3.50 for a bottle of water, at least more than once. Melancholy, from realizing it might be a long time before I can see some of my friends again. Finally, there was gratitude, for all of the people who work hard to put on a conference that proves to be a success year after year.

Thank you to the California State University, Northridge, Center on Disabilities (@CSUNCOD). While each conference I have attended over the years seems to have had a personality or flavor all its own, the quality of the presenters, topics offered, vendor exhibits, and social event schedule has been consistently high.

Thank you to the Manchester Grand Hyatt (@manchGrandHyatt) for providing conference attendees with what surely must be some of the best-trained and most customer-service-oriented staff anywhere. One day, while being guided from point A to point B, a trip so long it permitted a complete conversation, I learned that the young lady guiding me was not a hotel employee at all, but a volunteer. As it turns out, she is a local resident with a full-time job elsewhere, but she volunteers every year at CSUN conference time just to help us get from place to place. Extraordinary.

Thanks also to the sponsors who made some of the social events possible. The general tweetup was hosted by The Paciello Group, WebAIM, Infoaxia, PayPal, The Center on Disabilities at CSUN, EZFire, OpenDirective, and CA Technologies. Accessible Media Inc. (@a11ymedia) and SSB BART Group (@SSBBARTGroup) sponsored two of the receptions I attended. I’m sure there were others not known to me. Please let these fine organizations know how much you appreciated their hospitality: drop a comment below, send them a tweet, or write them a note if you were personally invited.

A special thank you to my roommate, Jennifer Sutton (@jsutt), who generously shared her space so as to make it possible for me to attend. She’s probably hoping for a less chatty roommate next year.

Finally, I’d like to say thank you to the members of the accessibility and disability community who attended the event. Whether you were a vendor showing off your latest and greatest product release, research, or educational support technology, a presenter, or any one of the thousands of my new best friends who flew into San Diego from far-flung places around the globe, I must say it was truly a pleasure spending time with you.

See you next year!


Optelec to announce new product launch at CSUN13

This just hit my desk, and I wanted to get it posted while you are still putting together your CSUN13 schedule.

Optelec invites you to attend this presentation:
Topic: Diagnostic Tool; Hope for Low Vision Patients

Description: There are many reasons low vision patients are turned away. What if there were a simple, inexpensive diagnostic tool?

Track: Blind/Low Vision
Session ID: BLV-053

Date: Friday, Mar. 1 @ 1:50 PM PST

Location: Annie AB, 3rd Floor
Presenter:
Rebecca Kammer, OD
Assistant Director of Optometric Education and Associate Professor, College of Optometry, Western University of Health Sciences

Check this out, while you’re walking the exhibit hall: Optelec Booth #205
28th Annual CSUN International Technology & Persons with Disabilities Conference

Exhibit Hall: Feb. 27 – Mar. 1, 2013

This year is different. We have a NEW product release unlike any other. We listened. We tested. We pushed the limits. We set the standards yet again.

Be there to witness low vision industry history in the making for our official worldwide product launch of the NEW….
Special unveil on Wednesday, Feb. 27th at 3:00 PM!
Where: Optelec Booth #205

The product speaks for itself, don’t miss it…
Point & Read to Stay In Touch!

**Plus, visit our Optelec Booth to learn how you can WIN $100 towards your next purchase**

Follow us on Twitter @Optelec with #CSUN13 and Facebook for announcements and photos!

Contact us at 800.826.4200 or marketing@optelec.com to connect at the show or arrange a demo at the booth.

FREE to ATTEND!
Exhibit Hall Schedule
Wednesday, February 27: 12:00 PM – 7:00 PM
Thursday, February 28: 9:30 AM – 5:30 PM
Friday, March 1: 9:30 AM – 5:30 PM

Optelec U.S. Inc.
800.826.4200 (main), 800.368.4111 (fax)
E: info@optelec.com

www.Optelec.com
See you there!


Novel approaches to icon-based AAC presented by Karl Wiegand

One can easily argue that few are as keenly interested in the well-being of a person with a disability as is a parent. Expanding from that core of support one can also include siblings, guardians, educators, social workers and health care professionals. One can further include advocates, friends, spouses and co-workers, all of whom are concerned about quality of life. That covers just about everyone, and just about everyone should be in attendance at Karl Wiegand’s presentation at this year’s Conference on Disability, hosted by CSUN.

Mr. Wiegand is presenting some astonishing work in the field of augmentative and alternative communication (AAC). His presentation, entitled “Novel Approaches to Icon-Based AAC,” will explore two different methodologies for message construction and input. These two approaches can elevate the quality of communication for a person who has locked-in syndrome. “Locked-in syndrome” is an umbrella term describing people whose paralysis is so extensive that they are unable to move any major body part, except perhaps above the neck. Even a person in a full body cast is an example of someone who may have a near-complete lack of motor function, albeit temporarily.

The choices in augmentative and alternative communication devices now commonly involve the use of mouth sticks, switches, or eye-gaze input devices that can be cumbersome and fatiguing for the user. The current systems were designed on the assumption that the user can press a button, make repetitive movements, or maintain a movement or body position for extended periods, so as to select letters or short words or phrases from choices on a menu. Using letter-based systems can be time-consuming, because a letter-based system is more generative than the icon-based system that some users prefer in face-to-face or real-time communication situations.

The challenge for Wiegand and his colleagues was to answer these questions: How can you redesign a screen to offer a large number of icons without displaying them all at once, which can be cognitively burdensome? How can icon-based systems be redesigned for faster and more efficient communication, as well as to accommodate users with upper limb motor impairments?

Together with his advisor and colleagues at Northeastern University, Wiegand is working on initial designs of two new approaches to icon-based AAC: one using continuous motion and one using a brain-computer interface (BCI). The continuous motion system, called SymbolPath, consists of 120 screen icons of semantically salient words. “Continuous motion” means that a user can touch a word to begin a sentence and, without breaking contact with the screen, swipe or drag from icon to icon, ultimately completing a sentence.
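To make the mechanics a little more concrete, here is a hypothetical sketch, in Python, of the simplest possible version of that idea. The grid layout, icon size, and word placement are my own assumptions, and the real SymbolPath system does considerably more, such as inferring intended words from a noisy path; the core step, though, is mapping one unbroken touch path onto the sequence of icons it crosses.

```python
# Hypothetical illustration of continuous-motion icon selection (not
# Wiegand's actual code): map one unbroken finger path over a grid of
# word icons to the sequence of words it passes through.

ICON_SIZE = 64  # assumed size of each square icon cell, in pixels

def icon_at(x, y, icons):
    """Return the word whose grid cell contains the point (x, y), if any."""
    row, col = int(y // ICON_SIZE), int(x // ICON_SIZE)
    return icons.get((row, col))

def words_from_path(path, icons):
    """Collapse a continuous touch path into the icon words it crosses,
    dropping immediate repeats so dwelling on one icon yields one word."""
    words = []
    for x, y in path:
        word = icon_at(x, y, icons)
        if word and (not words or words[-1] != word):
            words.append(word)
    return words

# Example: a drag that starts on "I", passes through "want", ends on "water".
icons = {(0, 0): "I", (0, 1): "want", (1, 1): "water"}
path = [(10, 10), (70, 20), (80, 40), (90, 90)]
print(" ".join(words_from_path(path, icons)))   # prints: I want water
```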

His second approach makes use of a practice borrowed from the field of psychology, called Rapid Serial Visual Presentation: a system that shows the user icons representing a word or short phrase, one at a time, in serial fashion. It allows for more efficient sentence construction than presenting the user with a screen full of icons that must be made small in order to offer a full complement of choices, which may be overwhelming.
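Again purely as an illustration, and not a description of the actual system, the skeleton of an RSVP loop might look like the sketch below: candidates are shown one at a time at a fixed interval, and a single yes-signal, whatever form it takes in practice (a switch press, an eye blink, a BCI classification), selects whatever is currently on screen. The timing, word list, and selection callback are all assumptions.

```python
# Hypothetical RSVP (Rapid Serial Visual Presentation) loop; the timing,
# word list, and selection callback are assumptions made for illustration.
import time

def rsvp(candidates, is_selected, interval=0.75):
    """Show one candidate at a time; return the first one the user signals on."""
    for word in candidates:
        print(f"\r{word:<12}", end="", flush=True)  # display the current item
        time.sleep(interval)                        # dwell time per item
        if is_selected(word):
            return word
    return None

# Simulated user who "signals" when the word 'water' appears.
chosen = rsvp(["I", "want", "more", "water", "please"],
              is_selected=lambda w: w == "water")
print(f"\nselected: {chosen}")
```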

This method of presenting information in rapid-fire fashion has been used before. If it sounds familiar, you may have once used this same technique if you’ve ever tried to tackle “speed reading.”

“My goal is to build a Star Trek computer,” Wiegand declares. He goes on to explain: “A computer like the one in the program Star Trek, that can understand anybody, and will do its best to fill a person’s desires or needs.”

Karl was gracious enough to patiently explain what essential elements of communication would be required in order to make a “Star Trek computer” possible. First, a computer would have to be capable of parsing, which involves sensing for context, and of speech recognition. Another element would be learning contexts, whereby a computer would come to understand how people interact with systems and what responses to expect from users. Finally, artificial intelligence would have to be achieved, enabling problem-solving with incomplete information, and natural language processing.

Until the point at which Mr. Wiegand has utterly changed our lives, and I do not doubt for a moment that he will, Wiegand says he’d like to work on Siri. To achieve his ultimate ends, Karl has worked in a number of other fields that have led him to this research. “I like AAC,” Wiegand continues. “It is a very focused area that is actually a vertex for four or five other fields.”

At CSUN, Karl will demonstrate the SymbolPath system, a prototype version of which is currently available for free on the Android app store (search for “SymbolPath”), show the BCI system, explain how both systems work, and talk about future directions for both. Wiegand hopes to have a system in place at his CSUN session so that attendees who interact with AAC users, friends or loved ones of AAC users, or AAC users themselves, can help create a corpus — a data set that shows what certain users want in certain times or settings or situations.

“We have revised both approaches based on initial testing and user feedback, and we are currently conducting several iterations of user-assisted design and revision before proceeding to full user testing,” Wiegand notes.

Attendees can help build this database by contributing realistic text, utterances, or phrases that AAC users like to say. If you attend the session, or find Karl throughout the week, you can contribute to the database or ask questions. In exchange, Karl will give you a copy of SymbolPath.

Karl will be presenting on Friday, March 1st at 3:10 pm in the Ford AB room, third floor.
Here is the link to the session page:
bit.ly/15yOOND

More about Karl Wiegand:

Karl Wiegand is a Ph.D. student in computer science at Northeastern University in Boston, Massachusetts. He works in the Communication Analysis and Design Laboratory (CadLab) under the advisement of Dr. Rupal Patel. Since joining the CadLab in 2009, Karl has been working on alternative methods of communication for users with neurological impairments and severely limited mobility. His research includes aspects of interface design, artificial intelligence, and language theory.

Here are more ways to contact Karl, and help with his corpus gathering project:

Karl Wiegand’s homepage: www.ccs.neu.edu/home/wiegand/
Karl’s lab: www.cadlab.neu.edu/
Link to Karl on LinkedIn: www.linkedin.com/in/karlwiegand/

Finally, if you know or love an AAC user, you can help get the ball rolling on data-gathering here:

www.cadlab.neu.edu/corpus/

Don’t forget to use hashtag #CSUN13 when tweeting about the event. See you in San Diego!


Sina Bahram to present an accessible, gesture-based approach to controlling classroom technology

There are any number of reasons one might attend a particular session at the upcoming 28th annual International Assistive Technology and Persons with Disabilities conference. You might want to learn more about a ground-breaking awareness project; you might want to learn a new skill; you might want to find fresh inspiration for your own work. One reason to attend Sina Bahram’s session is that he has helped to solve a problem that has affected educators, lecturers, and corporate presenters who are blind or visually impaired, as well as people who use tech automation in the workplace. He will discuss an accessible, gesture-based approach to controlling the technology in either a classroom or corporate setting.

Sina Bahram is a technical consultant and accessibility researcher pursuing his PhD in the Department of Computer Science at North Carolina State University. His field of research is Human Computer Interaction (HCI) with a focus on the use of innovative environments and multi-modal approaches to facilitate eyes-free exploration of highly graphical information. Combining artificial intelligence, intelligent user interfaces (IUI), and HCI, Sina devises innovative and user-centered solutions to difficult real-world problems.

Bahram’s session will show you how an instructor who is blind can independently give a presentation. Typically, when using the technology available to a sighted presenter, there are barriers imposed by the device that is used to control the projector, the microphone, the document camera, and other input devices. This controller, usually either a Crestron or AMX technology box, allows for many inputs that can be managed by way of a touch screen. This touch screen interface is inaccessible to blind instructors and presents numerous difficulties for a speaker or educator with low or no vision. For example, without sighted assistance, there is no way to know the state of readiness of the technology being used. There is no feedback alerting the presenter as to whether the projector is warmed up, or how he or she might adjust the volume level of the audio. Bahram will discuss and demonstrate how this embedded-system approach allows blind or vision-impaired instructors to control classroom technology.

The project is a collaboration between North Carolina State University, Bahram, Ron Jailall, who works in control systems programming and classroom design, and Greg Kraus, who is Coordinator of Campus Accessibility. They have devised an approach whereby simple gestures (swipe up, down, and to the right) are used to move among the various screen elements. Further, computer-generated speech is used to provide menu and status information.
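Purely as an illustration of that interaction model, and not of the NC State team’s actual implementation, the sketch below walks a spoken menu of classroom devices using three swipe gestures. The menu items and the speak() stub are my own placeholders.

```python
# Hypothetical sketch of the gesture-plus-speech interaction described
# above (not the NC State team's code). Menu items and speak() are
# placeholders for illustration only.

def speak(text):
    # Stand-in for a real text-to-speech call on the embedded controller.
    print(f"[speech] {text}")

class GestureMenu:
    def __init__(self, items):
        self.items = items
        self.index = 0
        speak(f"{items[0]}. Swipe up or down to browse, right to activate.")

    def on_swipe(self, direction):
        if direction == "up":
            self.index = (self.index - 1) % len(self.items)
            speak(self.items[self.index])
        elif direction == "down":
            self.index = (self.index + 1) % len(self.items)
            speak(self.items[self.index])
        elif direction == "right":
            # A real system would send the command to the Crestron/AMX
            # controller here, then report the device's new status.
            speak(f"{self.items[self.index]} activated.")

menu = GestureMenu(["Projector", "Document camera", "Microphone", "Audio volume"])
menu.on_swipe("down")    # announces "Document camera"
menu.on_swipe("right")   # announces "Document camera activated."
```

The point of the design, as described in the session abstract, is that every move is confirmed with speech, so the presenter always knows the state of the equipment without looking at a touch screen.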

“We have an underrepresentation of persons with disabilities in science, technology, engineering and math (STEM),” says Bahram. “In particular, people who are blind or visually impaired. This is one of the approaches that can help address this problem, in a small way, without having to depend upon a teaching assistant or student to assist. Now, a blind instructor can manage classroom technology independently.”

No matter the context in which you give presentations, craft accessibility policy, or purchase technology for employees or students who are blind, this session is for you. No special skill level is required to attend. All are welcome. Sina will be available for questions, demonstrations, and further discussion at any time you can catch him throughout the conference week.

More about Sina Bahram:
In 2012, Sina was recognized as one of President Barack Obama’s Champions of Change for his work in enabling users with disabilities to succeed in Science, Technology, Engineering, and Math (STEM) fields. You can read more about Sina and his research on his website, www.SinaBahram.com, or follow him on Twitter via @SinaBahram.

Be sure to check out the links below for more information.

For further ways to contact Sina, see his contact page at:
www.SinaBahram.com/contact.php
Read Bahram’s blog here:
blog.SinaBahram.com
Discussion of an Eyes-Free Approach to Controlling Classroom Tech:

Demonstration of an Eyes-Free Approach to Controlling Classroom Tech:

For more videos on other topics, Sina’s YouTube channel is at:
www.YouTube.com/sbahram

Don’t forget to use the hashtag #CSUN13 when tweeting about the event.


The 2013 Assistive Technology and Persons with Disabilities Conference

If you are a person who has a disability, or if you know or love someone who does, you will soon have an opportunity to attend what could be a life-changing event. If you have never before attended the International Conference on Disability, presented by California State University, Northridge, I am going to work hard over the next few weeks to give you some compelling reasons to attend. This annual conference is the largest of its kind, and each year showcases the very latest assistive technologies, teaching techniques and best practices for web and mobile accessibility development, as well as the latest in disability-related policy news and legislation. You’ll hear inspiring words from thought leaders and educators, and you can experience the camaraderie and fellowship of others who may be living with a disability similar to your own. If you can only attend one event this year, this is the one to attend. There is truly something educational, fun and uplifting here for everyone.

Start with this link, below. It will take you to the main page, where you will find all the info you need. Attendee registration is now open, so make your plans soon.

www.csun.edu/cod/conference/2013/sessions/index.php

If you want to explore the full list of educational sessions, click this link:

www.csun.edu/cod/conference/2013/sessions/index.php/public/conf_sessions/

You will be amazed at the range of topics, and the depth to which they can be explored. If you are not a technology fanatic, don’t worry. There are sessions on just about every aspect of disability awareness, accessibility and advocacy. All levels of expertise are addressed at many sessions, so don’t let intimidation or feelings of technical illiteracy keep you away.

There are also some social events you can attend. For example, The Paciello Group, WebAIM, Infoaxia, PayPal, The Center on Disabilities at CSUN, EZFire, OpenDirective, and CA Technologies will coordinate and sponsor a tweetup at the CSUN Technology & Persons with Disabilities Conference. The tweetup will be held Thursday, February 28th at 6:00pm at the Manchester Grand Hyatt, San Diego. Additional details will be coming soon. The tweetup is open to all Twitter users, but attendees are asked to RSVP.

csuntweetup.com/

Finally, be sure to use the hashtag #CSUN13 when tweeting about the conference. Check back here throughout February, as I will be showcasing a few of the presenters you can look forward to seeing at the conference. Make your travel arrangements early, and I look forward to seeing you there. You can follow me at @Accessible_Info on Twitter, so tweet me up so we can meet!
