Wednesday, March 29, 2006

Credibility Commons: Improving Access to Credible Information?

Librarians and information specialists often express concern that the quality of online information varies widely and that credibility is a major issue. Consumers often decide whether to use the information on a web site based on how professional the site appears or whether the site's information matches their personal views.

To address this issue, researchers from Syracuse University and the University of Washington have received a two-year, $250,000 grant from the John D. and Catherine T. MacArthur Foundation to establish a Web service called the Credibility Commons. Among other areas of research, the Commons will investigate the creation of a search engine that directs users toward sites to which skilled searchers (reference librarians) frequently refer their customers.

An experimental Reference Extract site is already available.

According to the Credibility Commons project site, the project is an "experimental environment enabling individuals the opportunity to try out different approaches to improving access to credible information on the World Wide Web. Tools will be provided to researchers as well as the public, allowing them to try out search strategies, collections and other approaches to improving access to credible information. The Commons can be viewed as a collaborative space in which to share ideas, data sets, results and innovations."
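The librarian-guided search engine the Commons proposes could be pictured as a re-ranking layer over ordinary search results. The following Python is a minimal sketch of that idea, not the project's actual design; the domain list, result scores, and boost value are all hypothetical.

```python
# Sketch of librarian-guided ranking: boost results whose host appears
# on a curated list of librarian-recommended domains. The list, scores,
# and boost value below are hypothetical illustrations.
from urllib.parse import urlparse

LIBRARIAN_RECOMMENDED = {"loc.gov", "nih.gov", "britannica.com"}  # hypothetical

def rerank(results, boost=10.0):
    """Re-order (url, base_score) pairs so recommended hosts rise."""
    def score(item):
        url, base = item
        host = urlparse(url).netloc.lower()
        # A host matches if it equals a curated domain or is a subdomain of one
        recommended = any(host == d or host.endswith("." + d)
                          for d in LIBRARIAN_RECOMMENDED)
        return base + (boost if recommended else 0.0)
    return sorted(results, key=score, reverse=True)
```

In practice the curated list would presumably be mined from reference transaction logs rather than hard-coded.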

The Commons will have three primary components:

  • TriPart Research: a series of studies that span individual credibility behaviors to collaborative group credibility actions. These research activities are tied to an ongoing, coherent research agenda, drawing on data from external studies, information from public use of credibility tools, and the evolving capabilities of Internet tools.

  • Tools: working with developers and information providers, regardless of industry sector or commercial status, to incorporate new credibility tools and mechanisms into a wide array of information products. The Commons will facilitate translating the research of leading scholars and organizations into real tools and mechanisms.

  • Public: the tools and research of the Commons will be validated in real use. A major component of the Commons will be a public Internet presence with tools, tutorials, research reports, consumer guides, and means of gathering user feedback.

The Credibility Commons is an outgrowth of the American Library Association's Office for Information Technology Policy and the University of Washington's investigation into the credibility of Internet information.

Tuesday, March 28, 2006

Are OSS4Lib Networks Needed?

I just finished reading Rachel Singer Gordon and Michael Stephens's column in Linux Insider regarding open source for libraries. Like many articles on the topic it focuses on the uses of open source software in libraries, something I wholeheartedly support!

However, I have a problem with libraries and open source software.

Libraries have traditionally banded together to pool resources with the common goal of obtaining monographs, serials, and databases as economically as possible. The decision to develop or join a library network is meant to effect a positive change in a library's ability to plan and budget.

Unlike these cooperative efforts of the past, libraries have chosen to work independently on information system solutions. With the significant costs involved in purchasing and maintaining commercial information systems, why haven't more libraries banded together to build library systems?

The open source for libraries approach to system development is different from past efforts to build “homegrown” information systems. Frequently, libraries that attempt to develop their own systems lack all the human elements to create scalable and portable systems. An IT staff with programming, testing, evaluation, troubleshooting, and user education skills is needed to create such systems. While a single library may lack the resources, a group of libraries working together has a greater chance of assembling a development team with a full complement of these skills.

An open source network can also serve as the peer review system that is missing from many homegrown development projects. When the programmer of a homegrown system leaves, the system gradually falls apart and dies. With open source development, someone in the development community usually takes over the management of viable and useful systems, and they continue to evolve.

In an era where budgets are trim and the need for innovative and flexible library systems is growing, libraries need to begin establishing new resource-sharing networks that focus on the development of information systems needed to support Internet-based library services.

However, in order for this to happen library administrators need to refocus their vision. They need to begin viewing open source products as alternatives to commercial offerings. They need to begin reallocating human and fiscal resources into the development of new systems that can change and adapt as fast as our environment does. They need to rethink the services their library offers and how those services are delivered.

Otherwise, there will be too many 1.0 libraries struggling to exist in a 2.0 world.

If You Read This Entire Web Page You Are An Exception To The Rule

In 2000, Jakob Nielsen authored the book Designing Web Usability: The Practice of Simplicity which emphasizes fast and efficient methods for improving the quality of user interfaces. He is currently a principal of the Nielsen/Norman Group and has been writing the AlertBox column since 1995.

Mr. Nielsen has recently taken to the road, giving four seminars based on a recently completed eye-tracking study that investigated how users consume Web pages. According to interviews with ZDNet's Dan Farber and with USA Today's Edward C. Baig, "...peoples' eyes flitter fast across pages. Very little time is allocated to each page element, so you have to be brief and concise in communicating online."

According to the study, web users don't read across the lines of a page. People look down web pages in an 'F' pattern, with the right-hand side of a page often never viewed. (So, what are your eyes doing over here?) Sometimes the eye tracking results in an 'E' pattern, but the study finds it's usually an F.

The Nielsen/Norman Group asked more than 230 participants to research specific tasks and company information online. Tasks included learning to tie a type of knot called a "bowline," shopping for a mortgage and deciding whether to adopt a cairn terrier or pharaoh hound from an animal shelter.

The study also discovered that pictures, images, and moving objects tend to be more of an obstacle than a draw. Image quality is a major factor in attracting attention, with people in pictures who face forward and look directly at the user appearing more inviting and approachable. Pictures that provide useful information, not just decoration, are more effective, while images in the middle of a page are problematic.

Advertisers should be concerned, since the study revealed that web users will only peek at ads in search engines as a secondary task because they usually have specific product targets in mind. Since users fixate on the first few words of a headline, but often only for a tenth of a second, concise headlines are essential.

By understanding the F principle, Nielsen feels that designing eye-catching and usable sites is possible.

I wonder how many libraries will revisit their sites and make changes based on this study?

Monday, March 27, 2006

Broadband over Power Lines: Causing Static?

In 2004, the Federal Communications Commission set up rules for companies rolling out tests of a technology called Broadband over Power Lines (BPL). BPL, commonly referred to as HomePlug or powerline communication (PLC), is a technology that uses radio waves, transmitted over power lines, to provide broadband Internet or other data connectivity.

BPL is attractive because of the power grid's ubiquity. It has been touted as a "third wire" into the home and a way to bring high-speed service to rural areas underserved by cable and phone companies. Such ubiquitous availability would make it easier to attach other electronics, such as televisions, to a network. BPL offers an obvious benefit over regular cable or digital subscriber line (DSL) connections since an extensive infrastructure is already available.

BPL service is established by inserting a radio-frequency signal onto the power line, much as a high-frequency signal is applied to phone lines to create DSL. The college radio station I worked at used a similar technology, called carrier current, to broadcast one of our stations to the dorms.

Unlike phone and cable wires, power lines that run above ground can act as large radio antennas, emitting the high-frequency signal as radio waves. In order to reach broadband speeds, BPL uses a large number of frequencies. However, radio waves can create interference in other electronics. This is why air travelers are instructed to turn off their electronic devices during takeoff and landing. Cable-based broadband gets around this by placing shielded cabling underground. If interference continues to be a problem for BPL, it could force upgrades to power networks, eliminating BPL's cost-effectiveness. The BPL signal itself could also suffer interference from outside sources.

According to the American Radio Relay League, the national ham radio association, radio waves from an improperly designed system can drown out amateur radio within a quarter of a mile. A report issued by the National Telecommunications and Information Administration (NTIA) concluded that a BPL transmitter operating within limits would significantly increase the noise for a vehicle-mounted receiver operating in a residential neighborhood next to a BPL-energized line, and that the receiver "may experience harmful interference" depending on the frequency.

While the technology has great potential to close that final mile, we will have to wait and see whether it cleans up its signal enough for widespread deployment.

Wednesday, March 22, 2006

(Yet Another) Browser Toolbar?

Every major search company has its own browser toolbar that leads the user to additional resources and services within that brand.

In another post I commented that information access issues were perhaps the result of the way people are now using the web. People are using many more tools today, and the concept of going to a library "home" page is becoming antiquated. So, how do you get resources to users who may not want to go to the library web site?

About a year ago I proposed to our staff the idea of a library toolbar that provides quick access to library resources. Competing priorities prevented further exploration. Well, today I created one for our library - in about 60 minutes.

In my web wandering I came across a free community toolbar creator offered by Effective Brand. This tool has been used to create toolbars at libraries including the Harris County (Texas) Public and Lansing (IL) Public libraries.

A browser toolbar focused on library resources allows the library to push resources and web contact access points to users as they are browsing elsewhere. It saves users time (Ranganathan would be proud) by sparing them from drilling down through their favorites list or shortcut bar. Since the bar's content can be updated centrally, any changes are automatically pushed out. The user can choose to use all the toolbar items or only some of them.

The creation tool itself is very simple to use and requires no real skill other than knowing how to cut and paste. I had no problems, except that for some reason I had to leave the News Search item available or the eight default search options would magically reappear.

Librarian 2.0 Requires New Skills

I started hitting the blogs again after reading the results of our library's LibQual+ survey. The results indicate that customers are unsatisfied with electronic access to library resources. While some of my colleagues point to our web site and its usability as the issues, I point to the possibility that people are simply using the web differently.

In late 2005, discussions about the concept of Library 2.0 began; the concept is concisely described in a white paper by Ken Chad and Paul Miller. Chad and Miller propose that "Library 2.0 is a concept of a very different library service that operates according to the expectations of today's library users. In this vision, the library makes information available wherever and whenever the user requires it." There are also concerns, expressed by Michael Golrick, that the Library 2.0 concept may disenfranchise those who need us most.

Since many are still not aware of this concept it is worth repeating.

The Principles of Library 2.0 (a summary of Michael Stephens's summary)

The underlying theme of Library 2.0 is the realization that library user communities are constantly changing and that library services must change with them. Libraries need to look at all our services and ask whether they still serve our customers. Do these services serve a large enough group that the return on the investment of human and fiscal resources is positive?

- The library is everywhere. Outreach via technology should be the goal of every organization. Librarians need to get out from behind the reference desk.

- The library has no barriers. Make sure that library users can get to information no matter where they are.

- The library invites participation. How can libraries reach out and interact with our users? Can a library present a tag cloud for the physical browsing of a library collection? Do library systems include built-in RSS feeds, tagging, and user commenting?

- The library uses flexible, best-of-breed systems. Libraries often make decisions about technology without much thought as to how systems interact. Are libraries licensing applications that will work for all of our users no matter where they are?

- The library encourages the heart. The library will be a meeting place, online or physical, where users' needs are fulfilled and where they are free to create. Libraries need to position themselves to help users find the answers to "how?" and "why?"

- The library is human. Users will see the face of the library no matter how they access services. Librarians will guide them via electronic methods as well as in person. Versed in the social networking tools, librarians will be able to adapt to the changing world. They will encourage and educate users.

- The library recognizes that its users are human too. Online and physical library space will be full of collaboration and conversation.
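The "built-in RSS feeds" item in the principles above is easy to picture concretely. Here is a minimal Python sketch that publishes a library's new acquisitions as an RSS 2.0 feed; the library name, URLs, and titles are made-up examples, and a production feed would be generated from the catalog database.

```python
# Minimal RSS 2.0 feed of new library acquisitions. All names and URLs
# below are hypothetical examples.
from email.utils import formatdate
from xml.sax.saxutils import escape

def acquisitions_feed(library_name, site_url, items):
    """Build an RSS 2.0 document from (title, link) pairs."""
    entries = "\n".join(
        f"    <item><title>{escape(title)}</title>"
        f"<link>{escape(link)}</link></item>"
        for title, link in items
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<rss version="2.0">\n  <channel>\n'
        f"    <title>{escape(library_name)}: New Acquisitions</title>\n"
        f"    <link>{escape(site_url)}</link>\n"
        "    <description>Recently added titles</description>\n"
        f"    <lastBuildDate>{formatdate()}</lastBuildDate>\n"
        f"{entries}\n"
        "  </channel>\n</rss>\n"
    )
```

A user who subscribes to such a feed never needs to visit the library home page to learn what's new, which is exactly the "library is everywhere" point.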

At issue is that the expectations of libraries for online interaction are changing. Users want to be content creators and contributors, not just consumers. We are seeing the emergence of the user-centered, online universe where people do most of their work, and play, within their web browser.

A discussion of the effect of Library 2.0 on librarians has to include the new skills that librarians need, a topic which Michael Stephens recently covered. We are at a time in our professional evolution where every librarian needs to have a core set of technology skills. A list of such skills appeared in the June 2005 issue of T.H.E. Journal.

As technologies continue to change and evolve, librarians must continue to strive for excellence in their work. Today that includes ongoing time and effort to maintain and improve their technology skills. To evolve, not only do librarians need to rethink their skill sets, libraries need to reevaluate and redefine certain jobs to include duties such as creating and using online tools for collaboration and creation.

The challenge is deciding what duties and processes need to be taken off job descriptions to make room for such tasks. How does one retool version 1.0 librarians who focus on library 1.0 services? Do we still need some 1.0 librarians in our organizations? Does the organizational structure of a library need to be flattened to include more workgroups and teams? This is a significant challenge for library administrators. They need to make some hard choices - now - in order to avoid their library becoming irrelevant.

Tuesday, March 21, 2006

Can I Cut My Land Line Now? Please?

Like most digital cameras, my new one has a feature that provides a shutter sound when I take a picture. The sound really serves no purpose, but it makes me feel comfortable that it is actually doing something.

I have not had cable TV service at home since 1998 (a DirecTV subscriber). I swore to my wife that I would never let a cable installer in my house again. We suffered with a land line for network connectivity until 2001, when we finally bit the bullet and allowed the cable TV installer in the house to install a broadband connection.

In 2004, we decided to further reduce monthly telecom costs and switched from traditional service to Vonage's voice over IP. I wish I could drop our land line altogether, but we can't.

DirecTV requires a land line connection in order to set up its service. They have not embraced the IP concept - yet. The DirecTivo service we subscribe to requires a land line so the built-in modem can call out to update the box and service information. DirecTivo prefers a traditional land line over voice over IP, since the internal 56k modem cannot slow down enough to negotiate reliably with the VoIP service. It works about 25% of the time, which is good enough. The box/service does not have IP capabilities, nor will it (that is really a topic in itself).

Another thing that keeps us from dropping our land line is my wife's 89-year-old grandmother. Call forwarding confuses her enough when she calls the house and we pick it up in the car, let alone the concept of us not having a phone in the house.

I imagine if phone-based services survive at all they will do so as some form of emulation, generating a fake dial tone on a digital network. The sound will really serve no purpose, but it will make us feel comfortable that it is actually doing something.

Friday, March 17, 2006

MovieBeam: Blockbuster's New Release Section in a Box?

In 2003, Disney announced a service called MovieBeam, a video-on-demand system aimed squarely at services like Netflix and TiVo. In early 2005 the service was taken offline, but it has recently received a second life with additional backing by Cisco Systems and Intel.

The MovieBeam device comes preloaded with 100 movies. That is not really a lot, considering the studios backing the product; I wouldn't expect to see Star Wars or Harry Potter any time soon. Once a movie is on the device, a customer can order it with the remote control, and it is viewable for only 24 hours. New releases will cost $3.99, while back-catalog titles cost $1.99. High-definition films will cost an extra dollar.

Each week, up to 10 new movies replace older titles. As new movies are automatically downloaded, older movies are automatically deleted to keep the number at a constant 100. The limited selection may keep interest limited to those who frequent Blockbuster's new release section.
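The weekly rotation described above amounts to a fixed-size, first-in-first-out catalog. A minimal Python sketch, assuming the oldest titles are the ones deleted (MovieBeam's actual replacement policy isn't specified):

```python
# Fixed-size FIFO rotation: new downloads evict the oldest titles so
# the catalog stays at a constant 100. The eviction policy is an
# assumption for illustration.
from collections import deque

CAPACITY = 100

def receive_weekly_batch(catalog, new_titles):
    """Add new titles to the catalog deque, evicting the oldest as needed."""
    for title in new_titles:
        if len(catalog) == CAPACITY:
            catalog.popleft()  # automatically delete the oldest movie
        catalog.append(title)  # automatically download the new one
    return catalog
```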

MovieBeam currently uses spare television broadcast bandwidth, usually that of a local PBS station, to send data to the 160GB hard drive in the set-top MB2160 boxes, co-branded with Cisco's Linksys division. An Internet version is in the works. The MovieBeam set-top box is priced at $200 after a $50 introductory rebate. While the antenna-based setup reduces networking headaches and broadband download speed limitations, it also limits the availability of the product to areas where MovieBeam-compatible transmitters are in place.

Besides the initial hardware cost there is a one-time activation fee of $29.99. There are no monthly charges, only the cost of the movies. While the lack of a subscription fee is always a plus, paying up to $5 for a title limited to a 24-hour viewing window seems steep.

Only about 10 percent of the movies offered are available in native HD format. The player does feature an HDMI connection that supports up-conversion from standard-definition content to high-definition output. A typical high-quality DVD movie file occupies about 8GB, but according to MovieBeam its file sizes are 1-1.5GB for standard-def and 5GB for high-def movies. Even with MovieBeam's efficient Windows Media 9 compression, that's pretty small for an HD movie, and it isn't a good sign for a satisfying HD experience. The movies available in HD are not the ones that would benefit from an enhanced viewing experience.
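Those file sizes translate into modest average bitrates. A quick back-of-the-envelope check in Python, assuming a typical two-hour running time:

```python
# Average video bitrate = file size / running time,
# assuming a two-hour movie (an assumption, not a MovieBeam spec).
def avg_bitrate_mbps(size_gb, hours):
    """Average bitrate in megabits per second (1 GB = 8000 Mb, decimal)."""
    return size_gb * 8 * 1000 / (hours * 3600)

dvd = avg_bitrate_mbps(8, 2)           # ~8.9 Mbps for a high-quality DVD
moviebeam_hd = avg_bitrate_mbps(5, 2)  # ~5.6 Mbps for a 5GB MovieBeam HD file
```

At roughly 5.6 Mbps, a MovieBeam HD file sits well below the roughly 19 Mbps of an over-the-air ATSC HD broadcast, which supports the skepticism above even allowing for Windows Media 9's better compression.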

The MovieBeam service is currently available in 29 metropolitan areas including Atlanta, Baltimore, Boston, Buffalo, Chicago, Cleveland, Dallas, Denver, Detroit, Houston, Jacksonville, Las Vegas, Los Angeles, Memphis, Minneapolis/St. Paul, Nashville, New York City, Orlando, Philadelphia, Phoenix, Portland (Oregon), Salt Lake City, San Antonio, San Diego, San Francisco, Seattle-Tacoma, St. Louis, Tampa and Washington D.C.

Friday, March 10, 2006

The Ultra Mobile PC: Is it Needed?

The Ultra Mobile PC (UMPC) is a new category of mobile devices optimized for specific on-the-go usage while providing the capability and versatility of a PC. UMPC devices are small enough to carry, have long battery life, offer multiple wireless options so they can be connected anytime, and are location-aware so they can adapt.

The hope of the UMPC is to allow individuals to access online games, videos, and music with the quality obtained in front of a wired PC. The UMPC also connects people via email, VoIP, instant and text messaging. The UMPC platform is also expected to have GPS capabilities which allow it to recognize its whereabouts and provide local information.

The first of these devices, marketed under the Microsoft Origami label, were finally unveiled at the CeBIT trade show. The show had devices built by Samsung, ASUS, and Founder on display. The technology was also discussed at the Intel Developer Forum.

The concept is built on top of the Windows XP operating system and has a screen larger than a handheld's but smaller than a notebook PC's. Origami devices won't fit in a pocket, but they will fit into the smallest of backpacks. The goal is to create a device that could eventually sell for $600 or less while supporting features like GPS, Bluetooth, 3G cellular technology, and Wi-Fi. It pairs small, lightweight hardware designs with the full functionality of a Microsoft Windows-based PC and a choice of input options, including a touch screen.

The size of the platform fits in between a tablet PC and a Sony PSP. Functionally, it fits in between a laptop and a high end PDA. The problem with this device right now is that everyone who might buy it already has a device with the basic capabilities that is working just fine.

The challenge for this product will be convincing people that both their current devices and their current ways of doing things are inferior. Current mobile device users (PDAs, Blackberries) will look at the device and say it is too big, since they are looking for something that fits in their pocket. Laptop and tablet users, meanwhile, may find that it does not have enough processing capability to replace a full Windows XP machine.

It may, however, be a form factor that would be very attractive in some applications, such as a physician making rounds in a hospital. It would have a large enough screen to view radiology images and test results without lugging a tablet around.