Wednesday, December 27, 2006

Is it a 10-7 for 10-codes?

Anyone old enough to remember TV shows such as Dragnet or Adam-12 knows that in police-speak the phrase "10-4" means "OK." The use of 10-codes dates back to the 1920s, when the police radio band consisted of only one channel. To communicate effectively, the 10-code system was developed to describe most police actions.

Each law enforcement agency began to adopt its own version of the codes. For example, a 10-32 in Whitfield County, GA means "subject with firearm," while in Volusia County, FL it refers to "breathalyzer available." A 10-34 in Whitfield means "open alcohol," while in Volusia it means "jail break."

In the aftermath of Sept 11th, the Department of Homeland Security began to "encourage" local authorities to drop 10-codes in favor of plain language in an effort to facilitate communications between law enforcement agencies.

Soon, people who hear someone say "10-4" will have no idea what it stands for. That is, unless CB radios make a comeback.

Wednesday, December 20, 2006

"TANK U" Very, Very Much!

I know this has been posted elsewhere, but it is such a great idea that it deserves plenty of coverage. A prototype download station, called TANK U, was presented at a November seminar in the Netherlands by Edo Postma of ProBiblio and Eppo van Nispen tot Sevenaer (Delft Public Library).

TANK U is a physical station that downloads information onto mobile phones. After the customer selects the desired content, the data is transferred to the phone using Bluetooth and is immediately ready for use. The content could consist of music, e-books, film trailers, audio books, lists of new additions in the library, college lectures, and audio tours. The content is selected by the library and is meant to give the customer an idea of what the library has to offer.

I see it now. Download vending machines are placed outside of lecture halls. Audio/video of the lectures is uploaded into the machine in real time and available for immediate download. Students capture the content on their Bluetooth devices without needing to find a network connection.

What a great idea!

Hmm.

As I have been thinking about being in Lorcan Dempsey's "in the flow," I have been limiting myself to the online world. This changes my thinking a bit. This type of technology puts the library into a customer's physical flow. There are plenty of places where potential library customers gather or wait, such as hospital waiting rooms. Such kiosks not only provide another distribution point, but also offer a great deal of marketing exposure for the library.

Thursday, December 14, 2006

Look Out Web 2.0, Web 3.0 is Coming! (Is Here?)

From the "only constant is change" department....

Late last year I started seeing mention of the concept of Web 3.0 on various blogs. After all, such a designation is the logical next step in an era of everything "dot oh." Personally, I am not a big fan of "dot oh" since it forces individuals to think in a linear manner, which in turn forces them to think about the steps required to transition to the next level, which in turn.... well, I digress.

The concept of Web 3.0 is really just a re-branding of the semantic web for mass consumption and is the next step in Web evolution. Those who have discussed this next phase see it as an era where Web applications will (finally) start to perform seemingly intelligent tasks, with the software itself discovering and making associations between far-flung bits of information. In other words, the markup used to create each Web page would be written to dynamically cross-reference countless other data sources. The content itself will be able to convey more meaning and interactivity than the relatively static sites of today.

For example, an attendee visiting a conference Web site could select a session from the online program and immediately transfer the date, time, and location to an electronic calendar. The location of the conference itself including address, latitude, and longitude could be sent to a GPS device. The names, biographies, and contact information of other attendees could be sent to an instant messenger contact list.
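As a rough sketch of the calendar piece of that scenario: a structured session record could be turned into a standard iCalendar (.ics) entry that any electronic calendar can import. Everything below (the session data and helper name) is invented for illustration:

```python
# Sketch: turning a structured conference-session record into a
# minimal iCalendar (.ics) entry a calendar application can import.
# The session record below is hypothetical.
from datetime import datetime

session = {
    "title": "The Semantic Web in Libraries",
    "start": datetime(2007, 6, 22, 10, 30),
    "end": datetime(2007, 6, 22, 11, 30),
    "location": "Room 204, Convention Center",
}

def to_ics(event):
    """Build a minimal VEVENT from a session record."""
    fmt = "%Y%m%dT%H%M%S"
    return "\r\n".join([
        "BEGIN:VCALENDAR",
        "VERSION:2.0",
        "BEGIN:VEVENT",
        f"SUMMARY:{event['title']}",
        f"DTSTART:{event['start'].strftime(fmt)}",
        f"DTEND:{event['end'].strftime(fmt)}",
        f"LOCATION:{event['location']}",
        "END:VEVENT",
        "END:VCALENDAR",
    ])

print(to_ics(session))
```

A semantic Web page would carry this same data as machine-readable markup; the point is that once the data is structured, moving it into a calendar, GPS device, or contact list becomes a small transformation rather than retyping.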

The problem I see is that commercial library vendors have development cycles that move at glacier-like speeds. Our reliance on commercial vendors means that libraries will start seeing applications that take advantage of semantic Web concepts some time after Web 4.0 talk is well underway.

Friday, December 08, 2006

Is MS Office OpenXML Document Format Nirvana?

The other day I was weeding my office and came across a handful of old 3-1/2 inch floppy disks. After finding a machine that actually had a floppy drive, I opened the contents for viewing. To my surprise, I found some old WordStar documents. They would not open in Word.

While there are support groups and projects like the Long Now Foundation's Format Exchange that provide tools that would have helped me open the files, the experience started me thinking about how many documents will be lost simply because their formats become obsolete.

Microsoft has been getting some press since its Office OpenXML (OOXML) format was recently approved to be sent to ISO for consideration as an international standard. Microsoft's OpenXML format is based on open standards and is available royalty-free. However, OOXML is not the only option on the road to office document compatibility nirvana.

The Open Document Format (ODF) is actually winning the overall standards race. On December 1, 2006, ISO ratified ODF as one of its official standards: ISO/IEC 26300:2006. The standard was developed by the Organization for the Advancement of Structured Information Standards (OASIS), a global consortium, and is based on the XML format originally created by OpenOffice.org. The OpenDocument specification is available for free download and use.
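One reason either format improves the odds of long-term access: both OOXML and ODF files are ordinary ZIP archives of XML, so their text can be recovered with standard tools even without the original application. A minimal sketch for ODF (the file name is hypothetical):

```python
# Sketch: an ODF document is a ZIP archive containing XML files,
# so its text can be recovered with nothing but standard tools.
# "example.odt" is a hypothetical file name.
import zipfile
import xml.etree.ElementTree as ET

def extract_text(path):
    """Pull the raw text out of an OpenDocument file's content.xml."""
    with zipfile.ZipFile(path) as odf:
        content = odf.read("content.xml")
    root = ET.fromstring(content)
    # itertext() walks every element and concatenates the text nodes.
    return "".join(root.itertext())

# print(extract_text("example.odt"))
```

Contrast that with a proprietary binary format like WordStar's, where recovery depends on finding software that still understands it.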

Corel, the maker of the WordPerfect Office suite, intends to support both OOXML and ODF in future versions.

While the debate between OOXML and ODF is likely to go on for the next couple of years, the discussion will certainly pressure vendors to ensure that the documents we create today will be accessible tomorrow, provided we can find a device to read the media they are stored on.

Monday, December 04, 2006

McMaster U Getting Out of Cataloging Business

When looking for interesting ideas and trends I often look up north to our friends at McMaster University Libraries. It seems they are always doing something while most libraries are still just talking about it.

It was therefore no surprise that this morning brought the post Getting Out of the Cataloging Business. McMaster is reallocating its cataloging staff to support emerging services. Original catalogers will be moving to “tiered reference,” and copy cataloging will cease. Staff will be given new titles and responsibilities:

Digital Strategies Librarian: This position will be responsible for developing the library's digital program, including the digital infrastructure and the implementation of an institutional repository, and for developing strategies to align these efforts with programs at the provincial and national level.

Systems Librarian: Responsible for managing systems and emerging technologies.

E-Resource Librarian: This position will be responsible for managing e-resource licenses.

Training and Development Librarian: This position will assist with staff development and work with the librarians to develop an understanding of curriculum development.

Friday, December 01, 2006

Blackboard's Patent Being Challenged

An article by Dan Carnevale (subscription required) appearing in today's Chronicle of Higher Education discusses a request presented to the U.S. Patent and Trademark Office by the Software Freedom Law Center to re-examine Blackboard's patent (U.S. 6,988,138) for Internet-based education support systems and methods. A press release also appears on the Center's Web site.

The patent grants Blackboard a monopoly on most educational software that differentiates between the roles of teacher and student until the year 2022. The Center provided examples of "prior art," a patent law term that refers to similar technology created earlier by a different party and that could be used as evidence to undermine a patent. In this case, the evidence consists of similar software that existed before Blackboard filed for its patent in 1999.

Blackboard's patent contains 44 claims, two of which are independent (#1 and #36). Each claim in a patent is either independent or dependent. Independent claims stand alone, while dependent claims narrow the scope of the independent claims. A product that does not infringe on the independent claims by definition does not infringe on the dependent claims; if the independent claims are not infringed, the patent is not infringed. The independent claims are:

1. A course-based system for providing to an educational community of users access to a plurality of online courses, comprising: a) a plurality of user computers, with each user computer being associated with a user of the system and with each user being capable of having predefined characteristics indicative of multiple predetermined roles in the system, each role providing a level of access to a plurality of data files associated with a particular course and a level of control over the data files associated with the course with the multiple predetermined user roles comprising at least two user's predetermined roles selected from the group consisting of a student role in one or more course associated with a student user, an instructor role in one or more courses associated with an instructor user and an administrator role associated with an administrator user, and b) a server computer in communication with each of the user computers over a network, the server computer comprising: means for storing a plurality of data files associated with a course, means for assigning a level of access to and control of each data file based on a user of the system's predetermined role in a course; means for determining whether access to a data file associated with the course is authorized; means for allowing access to and control of the data file associated with the course if authorization is granted based on the access level of the user of the system. "

36. An method for providing online education method for a community of users in a network based system comprising the steps of: a. establishing that each user is capable of having redefined characteristics indicative of multiple predetermined roles in the system and each role providing a level of access to and control of a plurality of course files; b. establishing a course to be offered online, comprising i. generating a set of course files for use with teaching a course; ii. transferring the course files to a server computer for storage; and iii. allowing access to and control of the course files according to the established roles for the users according to step (a); c. providing a predetermined level of access and control over the network to the course files to users with an established role as a student user enrolled in the course; and d. providing a predetermined level of access and control over the network to the course files to users with an established role other than a student user enrolled in the course.



Wednesday, November 29, 2006

Thoughts from a Marshall Breeding Lecture

I was over at OCLC this morning to listen to Marshall Breeding speak at their distinguished seminar series. If you ever need information about the library automation industry, he's the one to ask. His lecture was entitled "Trends in Library Automation: Meeting the challenges of a new generation of library users." Marshall has been a great contributor to the library profession over the past 20 years. I had a few thoughts afterward.

Commercial vendors need to recoup development costs and make a profit in order to survive. They rarely develop enhanced systems until they find critical mass. The effect is that many libraries must delay innovative services while waiting for commercial vendors to develop and market their solutions. We shouldn't have to tell our library vendors we need systems that can expose our content using Web services and then wait years for a critical mass to understand what that means before we see them.

Having already invested significant resources in the deployment of vertically designed proprietary systems, libraries have become locked into a very effective commercial vendor paradigm that most cannot afford to break away from.

It is like an old car that keeps breaking down. If the owner cannot afford a new car, they keep spending money to repair it. "What is another $1,000 repair when I have already put $2,000 into it?" The more money spent repairing the old car, the harder it becomes for the owner to afford a new one. The owner not only becomes trapped with the old car, but is also unable to take advantage of all the new features and technologies of the newer models.

I feel that in many ways libraries are trapped in a constant cycle of repairing the ILS.

If a library has so much invested in an ILS, will it simply wait until the system breaks down on the side of the road and leaves it stranded? Will purchasing a replacement simply put the library back into the repair cycle? At that point, shouldn't the library be looking at alternative modes of transportation?

Monday, November 27, 2006

Is Web Experience the Same as Web Expertise?

As I was reading a couple of recent posts about usability I came across a 2004 paper entitled "Older Adults and Web Usability: Is Web Experience the Same as Web Expertise?" by Ann Chadwick-Dias, Donna Tedesco, and Tom Tullis.

Their research suggests that Web experience is only a part of the equation when it comes to Web expertise. In fact, the strongest predictor of expertise was age independent of experience. In general, they found that older users have more usability problems when using the Web, independent of Web usage patterns (frequency of use, long-term use).

Their research found that older adults demonstrated less Web expertise than younger adults. Web expertise is significantly influenced by how users learned the Web: specifically, by the cumulative time spent in collaborative learning environments (learning from and with others), rather than just by how long or how often they have used it. The absence of collaborative learning is part of the reason older adults show a lower level of expertise when the level of experience is controlled.

I often notice that library Web usability study instruments frequently ask participants about their Web experience. Should we instead be more concerned with their Web expertise? I wonder what impact, if any, this distinction could have on library Web site usability studies. How is Web expertise defined and assessed? If we design for the older user, is the site less usable for the younger user? If we design for the younger user, do we create usability problems for the older user?

Should libraries not be spending any time on usability at all? Should we be creating basic Web sites and instead spend our time making our resources findable regardless of the access method?

Tuesday, November 21, 2006

A Zero-Day Virus Attack

Earlier this month part of our University was hit with a zero-day virus attack. I had not heard the phrase zero-day before now, and I suspect most people only learn about the concept when an attack occurs at their place of work.

Zero-day refers to a class of computer threats that exploit undisclosed or unpatched application vulnerabilities. Zero-day attacks are considered extremely dangerous because they take advantage of security holes for which no fix is yet available. They are difficult to defend against, are often effective even against secure networks, and can remain undetected after they are launched.

The Zeroday Emergency Response Team (ZERT) is a group of software engineers who work to release non-vendor patches for zero-day exploits. McAfee and Symantec deployed response teams to the campus. It took 48 hours to identify the virus and release a DAT file that patched and inoculated against it.

When the smoke cleared, over 1,900 desktops and 10 servers were infected with the mass-mailing virus referred to as W32/Nuwar or W32/Mixor. It damaged Microsoft Office applications, including Word and Excel. Fortunately, our crisis management strategies worked, or the damage would have been much more significant.

Friday, November 17, 2006

Rock Concert 2.0: Mobcasting

Last night I went to see the Blue Man Group's "How to be a Megastar 2.0" tour. Yes, it appears that everything is 2.0 these days.

What makes this a 2.0 concert is that BMG concert attendees can take part in the performance by using text messaging. At the start of last night's concert, the audience was invited to text the codeword "blue" to Mobkastr, a service from Counts Media.

Throughout the performance (for a fee of $1.99), audience members were instructed, via LED panels on stage, to text further codewords to the Mobkastr system. Subscribers were then fed a series of messages that interacted with the storyline.

I tried repeatedly to subscribe but was unsuccessful. I ended up sending several text messages to their tech support (who must have been swamped), who were trying to be helpful but were unable to figure out why the database would not accept my number. I was given several codewords to try, but to no avail. I probably spent $10 in messages trying to subscribe to the $2 service.

After returning home I began a stub for this post. A quick search revealed that this technology had a name: mobcasting. It is a play on the concept of mobile podcasting and Smart Mobs.

Imagine a very significant football game between heated rivals, say Ohio State vs. Michigan. During the victory celebration, a group of individuals begins to vandalize property. A few journalists may be there to cover the event, but chances are that many more individuals are present with video phones.

Observers capture the event on their video phones - dozens of phones from dozens of angles - and immediately podcast the footage on a personal or community blog. The footage gets aggregated on a single Web site from the RSS feeds produced by the podcasters' blogs. This leads to live event coverage by the bloggers, which can then lead to coverage by the mainstream media and the possible identification and prosecution of the vandals.

That's mobcasting.
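The aggregation step in that scenario can be sketched with the standard library alone, assuming each blog exposes an ordinary RSS 2.0 feed. The feed content below is invented:

```python
# Sketch: merging items from several RSS 2.0 feeds into one
# reverse-chronological list, the core of a mobcast aggregator.
# The feed XML here is hypothetical; a real aggregator would
# fetch each feed over HTTP.
import xml.etree.ElementTree as ET
from email.utils import parsedate_to_datetime

def parse_items(rss_xml):
    """Yield (published, title, link) tuples from one RSS feed."""
    root = ET.fromstring(rss_xml)
    for item in root.iter("item"):
        yield (
            parsedate_to_datetime(item.findtext("pubDate")),
            item.findtext("title"),
            item.findtext("link"),
        )

def aggregate(feeds):
    """Merge items from many feeds, newest first."""
    items = [i for feed in feeds for i in parse_items(feed)]
    return sorted(items, reverse=True)

feed = """<rss><channel>
  <item><title>Goalpost down</title>
    <link>http://example.org/1</link>
    <pubDate>Sat, 18 Nov 2006 16:05:00 -0500</pubDate></item>
</channel></rss>"""

for published, title, link in aggregate([feed]):
    print(published, title, link)
```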

Tuesday, November 14, 2006

The Changing Role of Cataloging?

I had several interesting conversations on a variety of library topics while visiting librarian friends this past weekend. One of the topics we covered focused on the future role of technical services and specifically, cataloging.

The one question we explored is whether extensive bibliographic records, whose data may be only rarely used, are still needed. What are the opportunity costs of managing cataloging staff to create such records when bibliographic detail is available elsewhere?

As I began writing a blog stub on this discussion, I came across Lloyd Sokvitne's paper Redesigning the OPAC: moving outside of the ILMS, presented at the Beyond the OPAC: future directions for Web-based catalogues seminar.

Sokvitne's discussion of the State Library of Tasmania's process of changing its OPAC design and functionality brought into sharp contrast the tension between the bibliographic data created to manage physical collections and the data actually needed to enable simple user-oriented discovery. The State Library's findings raise the question of whether it makes sense to alter the nature of cataloging activity to focus on discovery needs rather than bibliographic detail.

Much of the traditional MARC record and bibliographic system is geared to meeting the needs of acquisitions, unique title/edition identification, and internal collection management and use, not discovery using current search tools. Bibliographic records provide very little assistance in providing supplementary information such as a book synopsis, reviews, recommendations, ratings, and popularity which can help a patron select an item.

Sokvitne points out:

"Ultimately it would be better if libraries could create and share this type of data amongst themselves, and thereby provide a commercial-free source of evaluative data and information. It would be easy to argue that this type of data sharing and reuse among libraries would be more valuable in the web world that (sic) the recurrent sharing of unnecessary bibliographic data."

"If only we could share and access that data so as to deliver the type of advisory, recommendation, and supplemental information that is now expected by our users to augment bibliographic data. This type of data sharing may be more important in the long run in terms of keeping our services relevant than any amount of sharing of bibliographic data."

Libraries could be better served using bibliographic data from WorldCat with links to local acquisitions and circulation systems. Libraries need to look hard at moving common resources, such as bibliographic data, to a network-based resource-sharing level. We then need to look hard at reallocating technical services staff, specifically cataloging staff, to focus on creating discovery tools rather than bibliographic detail.


Wednesday, November 08, 2006

Common Components of Web 2.0 Platform

I came across the following while working on a book chapter about mashups. It appeared in an O'Reilly XML blog post by Dan Zambonini and was influenced by a Tim O'Reilly article.

It is a nice graphical representation of the various pieces and parts that make up programmable Web applications:



Friday, November 03, 2006

Library Trading Spaces

In a previous post I discussed how the physical space in libraries may be seen by some as more valuable than the information resources it houses.

The motivation for this thread was a discussion about the possible creation of a "Digital Library Task Force," suggested by senior leadership outside of the library. The idea may have been motivated by the discovery of the Stanford University Libraries and Academic Information Resources SEQ2 Library Vision: The Information Collaboratory report. The report states that in creating its new School of Engineering Center building, Stanford is "aiming for a bookless library" and that "eventually, the book collection will disappear altogether, the space that it occupied being re-configured for more study space, more collaborative spaces and perhaps more spaces for consultation."

If I were in such a senior leadership position and walked into the library after reading the report I would probably see the stacks as dead space as well.

While we are approaching this as a great opportunity to educate our leadership, the reality is that we still need to look at our space in an effort to at least make the stacks less visible. We are not alone on this issue:

- During the summer of 2005, the University of Texas removed the word "library" from its undergraduate library and converted the facility into the Flawn Academic Center. Almost all of the library's 90,000 volumes were dispersed to other university collections to clear space for a 24-hour electronic information commons.

- A similar event occurred at the University of Tennessee when their serials rooms were converted into The Commons.

- On October 10th, 2006, the Mills Library at McMaster University opened a new Learning Commons. The Mills Learning Commons Project is the result of a partnership between the University Library, the Centre for Leadership in Learning, the Centre for Student Development, and University Technology Services.

While the Commons is the current trend in space reallocation, what else should libraries do with their space?

The July 11, 2006, issue of the Chronicle of Higher Education contains an article by Scott Carlson entitled "Campus Planners Have a Tech-Savvy Generation's Needs to Consider" (account required). It discusses students' preference for casual and active study space, the use of increasingly smaller electronic devices, and the importance of 'sanctuaries' and 'transitional space.' Some spaces should be flexible, with movable furniture that allows students to spread out, and there should be ample space for writing and working.

Our library currently has 6 small study rooms that each seat up to 6 comfortably. These are very heavily used. I have suggested doubling the number of these spaces. They do not have to be walled in as they are now, but could be partitioned in a way that creates small rooms within a room. By not putting up walls, the space remains flexible.

I am sure our library is not the only one where students drag (and occasionally break) furniture into interesting arrangements. Our staff then drags the furniture back (hopefully without breaking it more) into more formal arrangements. Instead of waging a rearrangement war, we should be looking at the arrangements the students create. We should then build more spaces and purchase furniture that actually encourages rearrangement.

We also need to look within our academic communities for potential tenants. In our case, we see great potential in having Educational Student Services housed in our facility. We also need to create new spaces that connect library resources and services with teaching and academic spaces.

So, what is your library doing with its space?

Wednesday, October 18, 2006

Is Space Becoming More Valuable Than Information Resources?

If you have not had the chance, make sure you read Lorcan Dempsey's recent article in Ariadne and the associated blog posting on the discovery experience. While he once again weaves in the idea of libraries needing to be in the flow of our customers' information-seeking patterns, this time Lorcan discusses how the resources contained within the walls of libraries are no longer scarce:

"In a pre-network world, where information resources were relatively scarce and attention relatively abundant, users built their workflow around the library. In a networked world, where information resources are relatively abundant, and attention is relatively scarce, we cannot expect this to happen. Indeed, the library needs to think about ways of building its resources around the user workflow. We cannot expect the user to come to the library any more; in fact, we cannot expect the user even to come to the library Web site any more."

This observation is so true.

I recently had the opportunity to attend a meeting with Lorcan on this concept. In the "old" days, the resources housed in libraries were scarce; the only way to access the resources on the shelves was to visit the physical library. In the networked world, resources themselves are plentiful.

The challenge that academic libraries will face very soon is that although resources are no longer scarce, space is becoming increasingly scarce. Chances are that planners are already looking at the stacks of materials within the walls of many academic libraries as dead space. Libraries need to look at the information commons, small group study spaces, and other academic support services which could be offered within the library in order to protect their space.

These new uses of the physical library space could bring customers back into the library. Maybe they will not be using the library as we have grown used to, but maybe they will once again find value in entering our doors.

Tuesday, October 17, 2006

What Do Henry Ford and the Online Catalog Have In Common?

While Henry Ford's assembly line process was revolutionary, it did have a downside. All of the machine tools were created specifically and fixed in place for the Model T. Due to the significant costs of re-tooling, the Model T did not change for almost two decades.

Similarly, traditional software development uses a vertical architecture in which everything the program requires, including the data, the core application, and the interface, is created and fixed within that program. The online catalog is a great example: the bibliographic data, the application that searches that data, and the customer interface are all fixed within the online catalog system.

The assembly line problem was solved by General Motors, which used a flexible manufacturing approach in which sub-assemblies were created at different factories using interchangeable tools. This allowed GM to make changes to any of the sub-assemblies without disrupting the entire manufacturing process. The manufacturing approach Japanese automakers used to cripple the American automakers took the concept one step further: they interchanged parts between model lines.

In service oriented architecture (SOA), the data, application, and interface are separated so that each can be implemented using the best technologies for the task. The pieces can be interchanged or repurposed.

If one were to build an online catalog using this concept, each of the pieces of the online catalog would be separate software modules. Each would be designed using the best technology for the task. One could then replace the interface module without disrupting the application and bibliographic data processing modules.
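That separation can be illustrated with a toy catalog in which the data, the search application, and the interface are independent pieces. All names and records here are invented:

```python
# Sketch: a catalog split into data, application, and interface
# layers, so any one layer can be replaced independently.
# All classes and records here are illustrative.

RECORDS = [  # data layer: bibliographic records
    {"title": "Moby-Dick", "author": "Melville"},
    {"title": "Walden", "author": "Thoreau"},
]

def search(records, term):
    """Application layer: match a term against title or author."""
    term = term.lower()
    return [r for r in records
            if term in r["title"].lower() or term in r["author"].lower()]

class TextInterface:
    """One interface module; a Web or mobile UI could replace it
    without touching the data or the search logic."""
    def render(self, results):
        return "\n".join(f"{r['title']} / {r['author']}" for r in results)

ui = TextInterface()
print(ui.render(search(RECORDS, "melville")))
```

Swapping `TextInterface` for a different renderer leaves `RECORDS` and `search` untouched, which is the whole point of the SOA approach.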

As with the Model T, the amount of resources required to re-tool from one online catalog system to another is so significant that libraries also rarely switch. One hopes it doesn't take libraries two decades to figure this out.

More on Continuous Learning

Back in July, I jumped into the librarian continuing education thread with my post Continuous Learning: Making it a Priority...Period in response to the Continuous Learning: Making it a Priority Without Breaking the Bank post by Meredith Farkas.

Dave King has revived the discussion in his post Making Time for Web 2.0.

Learning new technologies is an essential part of the continuous learning process and should be a higher priority. Librarians also need to carve time out of their schedules to experiment, as well as eat their own dog food.

Monday, October 09, 2006

From Command and Control to Collaborate and Connect

Thanks to Michael Stephens for pointing out a World is Flat post by Will Richardson, “Learner in Chief” at Connective Learning and the author of Blogs, Wikis, Podcasts and Other Powerful Web Tools for Classrooms.

I would like to take the liberty of altering this post with a library POV:

"This is what happens when you move from a vertical (command and control) library system to a much more horizontal (connect and collaborate) flat library system. Your customer can do his and your job…Customers, if they are inclined, can collaborate more directly with more of their peers than ever before no matter who they are or where they are in the world…But librarians will also have to work much harder to be better informed than their customers. There are a lot more conversations between customers and librarians today that start like this: “I know that already! I Googled it myself. Now what do I do about it?”"

Tuesday, October 03, 2006

Moving Towards Consumption Management

Last week my home theatre audio receiver died. Over the weekend I began researching a replacement unit. (I'm not sure whether the deep research of a future purchase is a male trait or a librarian trait.) Since the dead unit was 10 years old, I had much to learn.

In the "old" days I either subscribed to different magazines or ran down to the public library to perform my research. Mail order companies like Crutchfield now have very helpful web sites that provide access to much more than a list of the technical specifics (watts per channel, sound processing standards supported, types of inputs) of the various models available on the receiver group page. I was presented with a host of research options.

The site allowed me to compare any receiver models, read reviews, or narrow my search by a host of options such as those with HDMI switching. I was able to view photos of the fronts and backs of the boxes or take advantage of a context-sensitive Hands On Research feature, which provides descriptions of all the features in that product category (including why I might need HDMI switching). There is also a Home System Planning Center as well as a DIY Installation Center.

This exercise got me thinking about a Lorcan Dempsey posting about Bjørn Olstad's presentation at Ticer entitled Advances in Search Driving Library 2.0 (available as a PDF download). This is the same Ticer program at which Michael Stephens and Jenny Levine also presented.

As Lorcan points out, Olstad discusses a move from content management to consumption management. The example used is the disruptive change that Yellow Pages are experiencing - a move from a provider view with few details and a shallow understanding to an enriched consumer view with recommendations, maps, and comparisons.

This was exactly my experience in researching home theatre receivers. I was no longer limited to a view of audio receivers containing a few details and a shallow understanding of the products. I was presented with an enriched product view with recommendations, comparisons, and how-to guides -- all linked from, and relative to, any single product. I found out all I needed to know to make a decision from this one site.

Unfortunately for Crutchfield, they did not have the best price.

Monday, September 25, 2006

Why Don't Men Ask for Directions?

Why do men always control the television remote?

Why does a man have the ability to identify a sports car model a mile away but cannot find the ketchup in the refrigerator?

I thought about these and other questions as I was contemplating Roy Tennant's August 15th, 2006 piece in Library Journal which focuses on the Gender Gap in digital library development.

I had to read the piece a couple of times since I was unsure of the point of the article. Was it about environmental factors preventing women from becoming involved in digital library projects? Or was it questioning why women were not out there leading the way at conferences, etc.?

I feel any perceived shortage of women in digital libraries has little to do with libraries, their culture, and work environments for female colleagues. Instead, it may have to do with our educational system and gender specific learning styles.

When I do an environmental scan I see many examples of women involved with library technology / digital libraries:

The interest that boys have in technology starts with the games and toys they play with. Today's video games are simply yesterday's erector sets or Lincoln Logs. By the time males reach the undergraduate level they are comfortable with the technology environment and culture. Young girls have not had the same exposure to the technology and feel less confident at the same point in life.

Males tend to have an explorer-type mentality, finding interest in just playing around with the computer to find out its capabilities. Women tend to prefer working towards a goal or end.

This difference in approach could also help explain why men don't ask for directions.

Introductory computing courses typically assign programming projects that may lack purpose or meaning to female students. Considering that females may begin technology programs with less exposure than males, their learning strategies may be less effective for skills like programming. Courses which encourage learning through repetitive exercise and projects without a direct application may discourage females from continuing with the major.

In the end, the problem may be the lack of a technology learning model that fits with the needs of women.

Friday, September 15, 2006

If "They" Build It, Will "They" Come?

In the "early days" of web site development it was common to adopt the line whispered to Kevin Costner in Field of Dreams: "If you build it, he will come."

Within the past 12-18 months attention has turned to social and participatory networks. A growing number of library customers now use discovery tools and information seeking patterns that do not involve the library. In fact, the concepts of findability and getting in the flow have become very important.

The question being whispered now is "If they build it, will they come?" The question I am starting to ask is whether web sites with content built by users are the answer for libraries. An emerging rule of thumb suggests they may not be.

Jimmy Wales, the founder of Wikipedia, introduced the 80/10 Rule in a 2004 presentation. In Wikipedia, 10% of all "logged in users" make 80% of all edits, 5% of all users make 66% of edits, and half of all edits are made by just 2.5% of all users. This is also supported by Bradley Horowitz of Yahoo (once a grad student at MIT's famous MediaLab), who points out that in Yahoo Groups, the discussion lists, "1% of the user population might start a group; 10% of the user population might participate actively, and actually author content, whether starting a thread or responding to a thread-in-progress; 100% of the user population benefits from the activities of the above groups."

The emerging rule is that, of a group of 100 people online, only one will create content, 10 will interact with it (commenting or offering improvements), and the other 89 will simply view it. This formula seems to hold true with other attempts by libraries to create systems with which customers can interact, such as MyLibrary.
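The 1/10/89 split can be expressed as a quick back-of-the-envelope calculation; the population sizes below are illustrative, not drawn from any of the studies cited:

```python
# Back-of-the-envelope 1/10/89 participation split for an online population:
# roughly 1% create content, 10% interact with it, and the rest only view.
def participation_breakdown(population):
    creators = population * 1 // 100      # ~1% create content
    commenters = population * 10 // 100   # ~10% comment or offer improvements
    lurkers = population - creators - commenters  # everyone else just views
    return {"creators": creators, "commenters": commenters, "lurkers": lurkers}

print(participation_breakdown(10000))
# → {'creators': 100, 'commenters': 1000, 'lurkers': 8900}
```

For a campus of 10,000 potential users, the rule predicts only about 100 content creators -- which puts the MyLibrary adoption figures below in perspective.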

After evaluating the MyLibrary service after three years at Virginia Commonwealth, James Ghaphery reported that 4% of the user accounts accounted for 60% of the use of the advanced features (similar to Wikipedia's 5% of users making 66% of the edits). Additionally, Ghaphery reported that 62% of the established accounts used the features two times or less. Similarly, only 4% of the total population at North Carolina State took advantage of the system.

While user-centered sites are great in concept, the question is how many customers will actually take advantage of the features. Does the low percentage of customers actually using the advanced features in existing customer/user-driven systems warrant the cost in time and resources to build them? By the time such a site is conceived, built and deployed, will the paradigm have changed yet again?

This is where the wonderful world of web services and mash-ups may fit in. Instead of developing large-scale local systems, should we instead be looking at ways to leverage services like WorldCat and let organizations like OCLC do all the large-scale stuff? Libraries could then focus on building lightweight, throwaway applications which mash that data to create local services. The social software concept would be a feature but not the focus of such systems. I will be participating in a discussion with an OCLC representative on Sept 18th on ways to use WorldCat in this manner and will post anything interesting that emerges.
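To make the "lightweight, throwaway" idea concrete: such a mash-up might be little more than a join between records returned by a union-catalog web service and local holdings data. The sketch below uses entirely invented field names and data, not an actual WorldCat response format:

```python
# Sketch of a lightweight mash-up: join union-catalog records (as they
# might come back from a web service) with local holdings data.
# Field names and data are invented for illustration.
def mash(catalog_records, local_holdings):
    local = {h["oclc"]: h for h in local_holdings}
    merged = []
    for rec in catalog_records:
        entry = dict(rec)
        # Annotate each catalog record with a local location, if held.
        entry["local"] = local.get(rec["oclc"], {}).get("location", "not held")
        merged.append(entry)
    return merged

catalog = [{"oclc": "123", "title": "The World Is Flat"},
           {"oclc": "456", "title": "Uses of Blogs"}]
holdings = [{"oclc": "123", "location": "Main Stacks"}]
for entry in mash(catalog, holdings):
    print(entry["title"], "--", entry["local"])
```

The point of the design is that the heavy lifting (the union catalog) lives elsewhere; the local application is simple enough to be rewritten or thrown away when the paradigm shifts again.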

References:

Ghaphery, James. "My Library at Virginia Commonwealth University" D-Lib Magazine
July/August 2002 8(7/8). Available at: http://www.dlib.org/dlib/july02/ghaphery/07ghaphery.html

Gibbons, Susan. "Building Upon the MyLibrary Concept to Better Meet the Information Needs of College Students" D-Lib Magazine March 2003 9(3). Available at: http://www.dlib.org.ar/dlib/march03/gibbons/03gibbons.html

Tuesday, September 12, 2006

Do Any Librarians Out There Cha-Cha?

Almost 10 years ago I flew to Santa Clara to attend the World Wide Web 6 Conference. I took with me the March 1997 issue of Scientific American with a series of very interesting Internet related articles.

One of the articles described a new search engine concept. What caught my attention was that the algorithm being used sounded a great deal like Science Citation Index Impact factors. The use of a common library concept for analyzing Internet search results caught my attention. What eventually grew from that concept was a large-scale hypertextual search engine.
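The ranking idea that caught my attention -- a page inherits importance from the pages that link to it, just as a paper gains standing from citations -- can be sketched as a minimal power-iteration computation. This is a simplification for illustration, not the actual engine:

```python
# Minimal PageRank-style power iteration: a page is important if important
# pages link to it -- the same intuition as citation impact factors.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # Each page passes its rank along its outbound links.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:  # a page with no outlinks spreads its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# Three pages: "a" is linked to by both others, so it ranks highest.
ranks = pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]})
```

The parallel to Science Citation Index impact factors is direct: a link, like a citation, is a vote, and votes from highly-ranked sources count for more.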

The other day I came across a new experimental search tool called ChaCha.

Developed by Scott Jones and Brad Bostic, ChaCha provides two search options. The "Guide" option should be of interest to librarians. By searching with a Guide the query is sent to a "real person" who is "skilled at finding information on the internet and knowledgeable on the subject at hand." "Once connected to a Guide you can chat with him/her to clarify your question. Discussing your question will get more precise results than any other search engine can deliver."

Sounds like a librarian to me. Again, the use of a common library concept for analyzing Internet search results caught my attention.

However, upon further review, ChaCha "Guides" are not librarians or necessarily any level of information professional. They can be college students, retirees, or stay-at-home moms. "Guides" are employed as independent consultants in a type of multilevel marketing scheme. In time, Guides recruit other Guides and receive a part of the recruited Guides' earnings. Think Amway.

Newly recruited Guides are matched with areas of personal interest and expertise and assigned a mentor. These "apprentices" do not initially interact with the public; to become "pros" they must pass tests for speed, quality and accuracy. "Pros" interact with the public and are paid $5 per search hour. Those who achieve "Master" level are eligible to earn 10% from those they have brought into the ChaCha Underground (the community of guides). Elite-level guides make $10 per search hour.

ChaCha is ad supported by display advertisers and sponsored links.

So, are any librarians out there a part of the ChaCha Underground?

Tuesday, September 05, 2006

GPL May Be Tested in Israeli Court

A GPL dispute appears to be headed towards the courts in Israel. The dispute is over a Java client for chess servers. The original author of the program Jin is challenging the use of his OSS software in the creation of IChessU, a for-profit entity.

From what I can make out, Jin creator Alexander Maryanovsky's problem with IChessU is that while IChessU has utilized Jin's code, it is not distributing Jin's entire source code. An A/V module in Jin is not being used by IChessU, and therefore its source code is not included. Based on the available documentation, IChessU's Alexander Rabinovich appears to argue that they are distributing the IChessU derivative source code, including the Jin code that was used.

It appears that Jin's creator feels that IChessU needs to distribute the entire Jin source code regardless of whether all the code is used. From my non-legal interpretation, IChessU appears to be following the GPL license. I may not be understanding Jin's argument.

The GPL states that the derivative must be distributed with the entire source code of the derivative along with a copy of the license, which IChessU appears to be doing. The GPL also states "You may modify your copy or copies of the Program or any portion of it..." The GPL does not specify that the derivative has to distribute the entire source code of the original program.

The GPL is also clear that "any work that you distribute or publish, that in whole or in part contains or is derived from the Program or any part thereof," must "be licensed as a whole at no charge to all third parties under the terms of this License." A violation would occur if IChessU began to charge anything more for the code than the cost of its physical distribution.

Wednesday, August 30, 2006

RedLightGreen to Cease Nov 1

Michael Brown at the University of California, Irvine has posted a message from Merrilee Proffitt, Program Officer, RLG Programs, OCLC Programs and Research, announcing that RedLightGreen will be "transitioning" to WorldCat as of November 1st. This is fallout from the recent OCLC/RLG partnership.

From that message:

"The only key feature that distinguished RedLightGreen from WorldCat.org was citation formatting, and the WorldCat.org development team quickly acknowledged that this feature would be a useful addition to WorldCat.org and are working quickly to make this feature available. Below is an overview of RedLightGreen features, and how they are covered in WorldCat.org.
  • FRBRization of results: RedLightGreen uses a FRBR-like approach to group works in RedLightGreen. OCLC is already employing a similar FRBR-like approach in Open WorldCat. Grouping of works is slightly different than in RedLightGreen; for example, titles in different languages are treated as separate works.
  • Ranking: RedLightGreen orders results by a combination of relevancy to search term and how widely held a work is among contributors to the RLG Union Catalog. Open WorldCat uses a similar approach, weighting the terms in certain fields and the currency of a work, along with how many holdings a work has.
  • Faceted display: RedLightGreen offers users facets for narrowing search results for subject, author, and language. WorldCat.org also offers facets as a way of narrowing a result set; currently facets include author, content, format, language, and year.
  • Citation formatting: RedLightGreen offers a very popular bibliographic citation feature. OCLC plans to offer citation formatting based on the RedLightGreen feature -- you can look for this in early 2007."

An Example of the Value of Scholarly Blogging

I will be participating in two panel sessions in early 2007. I was invited to participate in both primarily as a result of this blog.

At the 2007 ACRL Conference being held in Baltimore March 29th - April 1st I will be on a panel entitled "Technology Innovation in Academic Libraries: Rocking the Boat or Unfurling the Sails?" I will also be appearing on a panel discussing technology trends at the Medical Library Association Annual Meeting in Philadelphia May 18th - 23rd.

Over the past couple of months I have written several postings regarding blogging as scholarly communication. Blogging promotes the national and/or international recognition that is at the heart of the promotion and tenure process, consistent with the mission of most academic organizations. The fact that this blog caught the attention of meeting organizers demonstrates this value.

I am sure many other library bloggers have had similar experiences.

Friday, August 25, 2006

Are OPAC Vendors' Days Numbered?

I just did a quick scan of the study report Software and Collaboration in Higher Education: A Study of Open Source Software by Paul N. Courant (Principal Investigator) and Rebecca J. Griffiths.

The study was funded by The Andrew W. Mellon Foundation, Carnegie Mellon University, Foothill-De Anza Community College, Marist College, Indiana University, the University of Michigan, Stanford University, the University of North Carolina, and The William and Flora Hewlett Foundation.

The report references library OPACs:

"These systems are used to catalogue library holdings. There are a number of commercial products available (Ex Libris, Endeavor), but consensus seems to be that these systems are clunky and outdated. One theory we heard is that vendors are reluctant to invest in upgrading these systems because the function of libraries is in such a state of transition, and it is not at all clear what activities the software will need to support five to ten years from now. A number of people speculated that an open source OPAC would make sense, though the same challenges would apply."


As is the lifecycle of all too many software companies, the reluctance of vendors to keep their systems up to date results in a dwindling number of licensees. In time, these companies are financially supported by a handful of licensees who pay an increasing amount for support and customization. The licensee becomes trapped by the vendor since they have a large investment in the system!

The report also discusses the concept of an incubator for OSS projects. The benefit of this model is that it would provide a legal home for open source projects and reduce the overhead costs associated with setting up separate non-profit organizations for each one. While a different concept than the open source resource sharing networks I have previously discussed, it is in the same spirit.

I suspect that the combination of open source and the reluctance of vendors to keep their systems up to date will result in the demise of a significant number of commercial library vendors in the next five years. The poor performance and outdated state of commercial OPAC products is due largely to the disconnect between developers in software firms and their customers. This should be an advantage to library developers, and the timing to look at open source networks/incubators is ripe.

Wednesday, August 23, 2006

Has "Google" Always Been A Verb To You?

Each August since 1998, as faculty prepare for the academic year, Beloit College in Wisconsin has released the Beloit College Mindset List ®. Beloit creates the list to share with its faculty in anticipation of the first-year seminars and orientation. Those entering college were mostly born in 1988 and:

8. They are wireless, yet always connected.
19. "Google" has always been a verb.
20. Text messaging is their email.
23. Bar codes have always been on everything, from library cards and snail mail to retail items.
38. Being techno-savvy has always been inversely proportional to age.

Other technology notes of interest:

7. They have never heard anyone actually "ring it up" on a cash register.
28. Carbon copies are oddities found in their grandparents' attics.
36. They have rarely mailed anything using a stamp.



Monday, August 21, 2006

Is Your Library Organizationally Healthy?

I have had several interesting discussions recently about how a library's organization and culture are critical to its ability to innovate. (This topic thread was echoed in the recent Culture of No posting by Steve over at the Blog about Libraries.)

In browsing around on the topic, I came across a report entitled A Global Check-Up: Diagnosing the Health of Today's Organizations. As I read the report the idea that the "health" of a library (or any organization) is important to its ability to deal with technology made a lot of sense. However, this is not a new concept. It has been discussed in various contexts for decades.

Seven organizational types are identified in the report, with three considered "healthy":
  • Resilient: Flexible enough to adapt quickly to external market shifts, yet steadfastly focused on and aligned behind a coherent business strategy.

  • Just-in-Time: Inconsistently prepared for change, but can turn on a dime when necessary, without losing sight of the big picture.

  • Military Precision: Often driven by a small, involved senior team, it succeeds through superior execution and the efficiency of its operating model.

The other four organizational profiles were identified as "unhealthy":

  • Passive-Aggressive: Congenial and seemingly conflict-free, this organization builds consensus easily but struggles to implement agreed-upon plans.

  • Outgrown: Too large and complex to be effectively controlled by a small team, it has yet to "democratize" decision-making authority.

  • Overmanaged: Multiple layers of management create "analysis paralysis" in a frequently bureaucratic and highly political environment.

  • Fits-and-Starts: Contains scores of smart, motivated and talented people who rarely pull in the same direction at the same time.
In an unhealthy organization:
  • Culture is dominated by a few personalities that plan and act based on their own personal agendas. (Culture of No?)
  • Reactive planning. Change results from managing a crisis.
  • Organization lacks clear decision rights and doesn't share information effectively.
  • Administration does not articulate a mission and vision.
  • Staff and committees are given responsibilities but not given final decision making authority. (Culture of No?)
  • Decision making appears to be participative, but final decisions do not reflect the input and feedback. (Culture of No?)
  • Administration sees a far rosier picture than the rest of the organization.
  • Processes and procedures impede rather than facilitate.
  • Library administration seeks a passive resolution to unhealthy situations. They let the "kids" figure things out.
  • Lack of communication between divisions; lack of sanctions for non-communication.
Does any of this sound familiar?

Wednesday, August 16, 2006

Scholarly Blogging: The Quiet Revolution

A new book by Axel Bruns and Joanne Jacobs entitled Uses of Blogs was recently released. It contains a chapter by Alex Halavais entitled "Scholarly Blogging: Moving Toward the Visible College."

In the chapter Halavais writes:

"We are in the midst of a quiet, uneven revolution in academic discourse, and blogging and other forms of social computing make up an important part of that revolution. We may filter our view of blogging through a set of archetypal scholarly communication settings: the notebook, the coffee house, and the editorial page. For now, scholarly blogs are a bit of each of these, while they are in the process of becoming something that will be equally familiar, but wholly new."

I need to get a copy of the book to see if he says anything about acceptance by tenure and promotion committees as well as quality indicators and impact factors.

Monday, August 14, 2006

Blog Quality Indicators and Impact Factors

A couple of months ago in my post Is Blogging Scholarly Communication? I argued that blogging can be a significant form of scholarly communication. It meets the goal of scholarship and service that leads to the national and/or international recognition that is at the heart of the promotion and tenure process and is consistent with the mission of most academic organizations.

In the post I highlighted several issues which are barriers for academics. One of the concerns was quality indicators. How does one quantify a blog's impact?

Walt Crawford's thoughtful approach to identifying the "reach" of librarianship-oriented blogs provides a few interesting ideas. Walt's study is a follow-up to his 2005 study. He comes up with several Top lists. Walt is very clear that his list is not the Top 50, but a Top 50, which was primed with his own Bloglines subscriptions. With over 554 liblogs to work from, Walt "drained the pool" based on the number of subscriptions and the number of links found in Google and MSN Search.

The problem with today's environment is that there are so many aggregators and search tools indexing blogs that pulling together information to make such an analysis is extremely time consuming. I do not know how much time it took Walt to compile his study, but I will assume it was much more than anyone could suspect.

The following are some of the criteria used in his study and my comments:
  • Frequency of Posts. The frequency of posts by a blogger is a topic that has had some recent discussion, sparked by a post by Eric Kintz. As Kintz points out, frequent posting creates the equivalent of a blogging landfill. According to Technorati, only 11% of all blogs update weekly or more. While an interesting stat, frequency serves less as a quality indicator than as an indication of some level of proficiency.

  • Comments. The number of comments on a post does provide some insight into which posts are hot topics or hit a particular nerve. Walt refers to comments as "Conversational Intensity." The theory here is that interesting or controversial posts will result in a higher number of comments.

    As Walt points out there are blogs that do not have comments activated, which causes some problems. A concern I have is that within certain communities a core group of bloggers will comment on each other's postings, which is similar to citing a friend's work. Authors will also respond to each comment posted. Both these behaviors will artificially inflate the comment total.

    Still, I view comments (and topic-spawned posts) as the blogging equivalent of peer review. In many respects, this post-publication review process may be more critical and may advance concepts further and faster than the traditional peer review and publication process. The challenge is encouraging "quality" comments and getting buy-in from the academic "traditionalists."

  • Length of Posts. Posts longer than the average of 268.5 words were classified by Walt as "essays" and those less than a quarter of the average as "terse." The question is which approach has more impact. Since I am discussing scholarly communication, I would propose that essays would have a higher impact. However, sometimes a three-page article will have more of an impact than a 50-page article.
While the number of comments does indicate that a post has some value, the Number of Links to a specific blog or post is perhaps the strongest indicator. This would be similar to the citation index analysis approach. Citation analysis is one of the most widely accepted quality indicators. While Walt used a minimum number of links to drain his pool, linking was not used in his final analysis. This metric is not without issues, such as a blogger inflating their site's value by linking to themselves, a tactic similar to someone citing themselves.
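A link-count impact measure that discounts self-linking, analogous to excluding self-citation in citation analysis, could be sketched like this; the blog names and link data are invented for illustration:

```python
# Count inbound links per blog, excluding self-links -- the blog equivalent
# of discounting self-citation. The link data here is illustrative only.
def inbound_link_counts(links):
    """links: iterable of (source_blog, target_blog) pairs."""
    counts = {}
    for source, target in links:
        if source != target:  # ignore self-linking
            counts[target] = counts.get(target, 0) + 1
    return counts

links = [("blog_a", "blog_b"), ("blog_c", "blog_b"),
         ("blog_b", "blog_b"),  # self-link, not counted
         ("blog_a", "blog_c")]
print(inbound_link_counts(links))  # → {'blog_b': 2, 'blog_c': 1}
```

A fuller version would also need to discount reciprocal linking within a tight-knit community, the "citing a friend's work" problem noted above, which a simple pair count cannot see.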

I would like to thank Walt for his analysis, since blog quality indicators and impact factors are an issue I am very interested in now that I am on the local promotion and tenure committee. As with any such analysis, coming up with a set of metrics in an effort to identify quality is a challenge. This is certainly a great stepping-off point for future discussions.

I will have to wait and see how many comments and links this essay receives.


Additional Resources

Michael Stephens. Evaluating LIS Weblogs
LISFeeds
LISWiki
DMOZ/Open Directory

Monday, August 07, 2006

The Perfect Storm: Did Libraries Miss the Weather Reports?

I had a chance to start reading a 2005 book from three-time Pulitzer Prize-winning New York Times columnist Thomas L. Friedman entitled The World Is Flat: A Brief History of the Twenty-First Century. While the book has received mixed reviews, it has certainly sparked conversation.

Friedman describes how an unplanned cascade of events and advances in technology and communications has effectively leveled the economic world. He contends that although the dot-com bubble and subsequent bust were bad for some investors, they were beneficial in opening up world markets. The overcapacity which produced the bust reduced the cost of entry and enabled players from marginal regions, like China and India, to get into those markets.

Friedman discusses how specific events converged around the year 2000 and "created a flat world: a global, web-enabled platform for multiple forms of sharing knowledge and work, irrespective of time, distance, geography and increasingly, language." A "political perfect storm" including the dotcom bust, 9/11, and Enron served to "distract us completely as a country." Just when we needed to face the fact of globalization and the need to compete in a new world, "we were looking totally elsewhere."

In thinking about this, the library world has also experienced a perfect storm of events which has flattened the information world. Just when we needed to face the fact that we had to compete in a new networked world, we too may have been looking elsewhere when:
  • NSFNet: Created in 1985, this TCP/IP backbone network was later opened to commercial interests.
  • WWW: CERN first publicized Tim Berners-Lee's work in 1991, allowing anyone to publish online.
  • NCSA Mosaic 1.0: The first graphical web browser, released in 1993, gave users easy access to information resources anywhere on the Internet.
  • Google: Started in 1996 as the result of a personal argument and grew into a research project called BackRub. Recent forays into the library space, including Google Print, Google Scholar, and Google Library, are having a disruptive effect on libraries.
  • Open Source: The infrastructure under many Web 2.0 services. Yet most libraries have not embraced the approach and still rely on vendor-based proprietary systems.
  • Social Software: The rise in popularity of MySpace and other social software services is creating new information sharing networks outside the librarysphere, along with the rise of social tagging.
  • Wireless: Devices can access information resources from anywhere, so library customers are no longer dependent on the library as a building.
There are librarians out there who may view all of the above as bad for the future of libraries. I, however, feel they have been extremely beneficial in opening up the world of information. The fact that people are using Google, consuming information, and creating and posting content (and tagging it) is the important point.

Some librarians are very anxious over the ideas of social tagging, wikipedias, and customers not entering the library building. These tools and the resulting information seeking behaviors are now a part of the information landscape. Or, as Lorcan Dempsey points out, they are now a part of the "lifeflow":

"We have begun to realize more keenly that the library needs to co-evolve with user behaviors. This means that understanding the way in which research, learning, and consumer behaviors are changing is key to understanding how libraries must respond. And as network behavior is increasingly supported by workflow and resource integration services, the library must think about how to make its services available to those workflows."

Libraries have reached a new crisis point in their evolution. If we continue to stay outside the information flow of our customers, then libraries will have a very big crisis on our hands, one from which we may never recover. It is therefore more critical now than ever for library leaders to adopt a new vision, reallocate resources, create new service models, and get back in the flow.

Monday, July 31, 2006

Is Your Web Site's Metadata Being Used?

I recently rediscovered a June 2005 article entitled Optimising Metadata to Make High-Value Content more Accessible to Google Users by Alan Dawson and Val Hamilton.

The article details a study that revealed that constructing a web page title tag from the content of four metadata fields (title, type, author, date) is the most important step in Google optimization. While HTML has a "meta" tag which is often (painstakingly) used by libraries, its value is currently negligible in a Google-dominated search engine world due to misuse.

According to the authors, even if a web page includes meta tags for a description and other data, they are not used by the public Google engine. (Google appliances do index meta tags.) Instead of using the meta tags, the public Google extracts "snippets" (the official term) from the full text of documents to serve as page summaries. The value of Google snippets as descriptions and in indexing is highly variable.

Google rankings can be improved by including the content that one would place in the meta fields as visible text -- not HTML-embedded tags -- near the start of the page. They suggest placing the text beneath the title and author name. This increases the odds that when a search term matches a word near the start of the page, the Google snippet will include the term in the description, or at least at the start of it.
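The title-tag construction the authors describe can be sketched as follows; the record fields and values here are illustrative, not taken from the study:

```python
# Build a descriptive <title> from four metadata fields (title, type,
# author, date), per the optimization approach described in the article.
# The record below is an invented example.
from html import escape

def build_title(record):
    # Join whichever of the four fields are present, in a fixed order.
    parts = [record.get(k) for k in ("title", "type", "author", "date")]
    return ": ".join(escape(p) for p in parts if p)

record = {"title": "Glasgow Digital Library", "type": "e-book",
          "author": "Dawson, Alan", "date": "2005"}
title_tag = "<title>%s</title>" % build_title(record)
print(title_tag)
```

The same string could then be repeated as visible text near the top of the page, per the authors' suggestion, so that Google's snippets pick it up as the page description.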

While the top of each web page may begin to look like the start of a catalog record, this approach not only makes the metadata visible to searchers but also increases the odds that it is actually used.

Wednesday, July 26, 2006

Continuous Learning: Making it a Priority...Period

In Continuous Learning: Making it a Priority Without Breaking the Bank, Meredith Farkas at TechEssence writes how librarians need to continue to learn, to grow, and to expand our views of libraries and technologies. Her post is dead on point.

Meredith's post focused on what administrators should do to support learning. I would like to build upon her post but focus on what librarians should be doing themselves, regardless of administrator support.

Professional Literature: Meredith points to freely available online journals that would benefit librarians such as Ariadne, D-Lib, College and Research Libraries, Cites and Insights, Library Journal, School Library Journal and Educause Review. In addition, she points to DLIST and E-LIS as repositories for scholarly communications. Knowing these tools are out there is one thing; carving out time on a regular basis to actually read them is another.

Blogs: There are a growing number of blogs out there that focus on librarianship and applied library technologies. As Meredith points out, the great thing about blogs is that there is no editorial delay. While some may feel that the lack of peer or editorial review weakens professional communication, bloggers report on events or information as it is happening.

Webcasts: A growing number of organizations are offering free access to webcasts, which allow librarians to interact in real-time with other professionals. While some services may only offer free archive access, they are still valuable learning tools. Free webcasts are available through OPAL, the SirsiDynix Institute, the Blended Librarian community, and InfoPeople. WebEx is one of the larger pay-per-use commercial hosting sites, where use is paid for by the presenter, not the attendees.

Podcasts: Podcasts are a convenient tool for learning. At this time few librarians are using podcasts, as they do blogs, to inform and educate their colleagues. Podcasts relevant to librarians include The Library Channel from Arizona State University, LiS Radio from the University of Missouri, SLIS Media Feed from Indiana University, Check This Out from Jim Milles (University at Buffalo Law School), Open Stacks, and Talking with Talis. OPAL and the SirsiDynix Institute also make their webcasts available in podcast format.

Here are some ways librarians can integrate technology/research into their daily workflow:

  • Create tasks in your planner as a reminder to review online (or print) resources on a weekly basis. Sometimes workflow makes them a low priority, but the reminders will prompt you to at least browse them on a regular basis.
  • Set up a Bloglines (or any aggregator) account and begin learning how to use RSS feeds. Not only will the content be useful, but learning how RSS feed aggregators work is becoming an essential skill. If needed, one can take advantage of RSS online tutorials.
  • Contribute to Wikipedia. Some write it off as a non-authoritative source since anyone can contribute. Not only can you help make it more authoritative, you will learn how to use a wiki along the way.
  • Attend webcasts. Invite colleagues to join you and make it a journal club gathering.
  • Download podcasts. I often grab my iPod and lunch and go outside on nice days, particularly from The Library Channel. I have also been known to listen to podcasts while mowing the lawn.
  • Set aside some time at various meetings to discuss interesting developments in librarianship and/or technology. Have people share what they have read or played around with that is particularly interesting to them.
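Several of the steps above lean on RSS, so it is worth seeing how little machinery a feed aggregator really needs. As a minimal sketch (the feed XML below is invented; a real aggregator such as Bloglines fetches XML like this from each subscribed feed URL), extracting the entries comes down to a few lines:

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 feed, inlined for illustration.
SAMPLE_FEED = """<rss version="2.0">
  <channel>
    <title>A Library Blog</title>
    <item><title>Post one</title><link>http://example.org/1</link></item>
    <item><title>Post two</title><link>http://example.org/2</link></item>
  </channel>
</rss>"""

def list_entries(feed_xml):
    """Return (title, link) pairs for every <item> in an RSS feed."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in list_entries(SAMPLE_FEED):
    print(f"{title}: {link}")
```

An aggregator simply repeats this over every subscribed feed and flags the items it has not shown you before.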
Learning new technologies is an essential part of the continuous learning process and should be a higher priority. Librarians also need to carve time out of their schedules to experiment and eat their own dog food.

The challenge is that the library professionals who would benefit most from this advice are not likely to read this (or any other) blog. Hopefully those people will pick up Michael Stephens' Web 2.0 and Libraries: Best Practice for Social Software.

Monday, July 24, 2006

Carnival of the Infosciences #47

Come one, come all to Carnival of the Infosciences #46. (Note: This was scheduled as #47 but the production schedule changed on me. I would change the title, but it would break any links....) The entries here were selected from my blogroll, uncovered doing Technorati searches on library-oriented subject tags, or submitted by fellow bloggers. The editorial choices are simply topics or points of view I found interesting.

Submissions:

Rick Roche is taking on a very ambitious project in building the Practicing Librarians' Book Review wiki. The site provides a forum for librarians to write about books they have read and communicate their thoughts to the library community without having to be accepted by a journal. Becoming a contributor to the wiki is as easy as signing up and logging on. Good luck, Rick!

Steve Matthews at the Vancouver Law Librarian blog provides a general summary in Drupal & An Introduction to Open Source CMS Products. Drupal is a content management system built for LAMP (Linux, Apache, MySQL, and PHP, for those unfamiliar with the acronym).

Editorial picks:

  • At Library Juice Rory Litwin discusses "Wikipedia and Why Librarians Make Good Wikipedia Contributors". Wikipedia is cited by some librarians as a non-authoritative source because of the lack of content and editorial control, but Rory argues that librarians should take a more active role, with which I agree:

    "Initially I didn’t see much of a connection between librarianship and Wikipedia editing, because working on an encyclopedia seemed to me to be more of a writer’s or a researcher’s pastime than a librarian’s. As I got into it, however, I realized that the standards for writing a Wikipedia article are similar to a reference librarian’s approach to answering a reference question, especially in relation to one of the main “pillars” of Wikipedia: the “Neutral Point of View,” or NPOV, policy."


  • StevenB posted to ACRLog his reaction to a Boston Globe editorial about the "Catered Generation". There are probably many Carnival readers who agree with his perspective:

    "Is [sic] seems our profession has likewise become preoccupied with discovering methods to provide students with the lowest-common denominator research tools and the elimination of anything that might be perceived as too complex for fear that students will - what - complain that libraries are too hard to use. Do we fear that students will abandon our resources for the ones that do coddle them by eliminating the possibility of failure? It’s almost impossible with most search engines, no matter how awful your search is, to get nothing in return. You can’t fail. With a library database if you do a poorly conceived search you will likely retrieve nothing - the equivalent of failure. Heaven forbid we might expect someone to show some resolve and actually think about what they did and try to improve upon it - even if the cause of failure is as minor as a mispelled word."


  • In Continuous Learning: Making it a Priority Without Breaking the Bank, Meredith Farkas at TechEssence writes how librarians need to continue to learn, to grow, and to expand our views of libraries and technologies. As library professionals, this learning process needs to happen every day, every week, and every year that we are on the job. Perhaps most importantly, this activity should be encouraged by administrators as an integral part of our work:

    "Administrators should encourage all employees to continue developing their skills and knowledge in this rapidly changing field. It should be just as much a part of our job as attending meetings, serving on committees, and other basic responsibilities."

    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    Next week's carnival (#47) will be hosted by Woody over at ISHUSH.
    For future dates, check the wiki. Make sure to take a turn hosting as well!!!

    "La Grande Wheel" is by laanba and published with rights granted through Creative Commons Attribution-NonCommercial-NoDerivs 2.0


Wednesday, July 19, 2006

Why Librarians Should Eat Their Own Dog Food

I am on a local task force investigating the deployment of the next version of Microsoft's SharePoint Services. During the first meeting the convener kept referring to us as the dog food group and saying that we would be "eating our own dog food." Since I had not heard this phrase before I did what any librarian would do - I went to Wikipedia and Google.

The metaphor is that a company "eating its own dog food" does not merely consider the value of the product for consumers (i.e. whether the dog will eat the dog food), but is itself a consumer of the product. As a great example, a recent article in BetaNews describes how Microsoft no longer releases any software that it is not first using in production in house. In fact, Microsoft indicates this approach ultimately saves the company staff time!

In software development, "eating [one's] own dog food" refers to the point at which a product under development is delivered, in a rough state, to everyone on the project for use. These early versions may contain bugs, crash often, lose data, or otherwise be unusable. It is a way of verifying that the product works under real-world conditions. This approach makes the staff feel the pain before our customers feel it.

The problem as I see it is that all too often librarians come up with ideas for services but do not fully understand all of their aspects (including technical support!!!) before moving ahead with planning and deployment. The library using its own products and services before general availability has four primary benefits:

1. Librarians are familiar with using the products and services they develop.

2. Library staff have direct knowledge and experience with these products and services.

3. Customers see that the library has confidence in its own products and services.

4. Library staff, with perhaps a very wide set of technical skills, are able to discover and report bugs in the products and services before they are released to the customers.

So, as a standard part of the innovation process libraries should indeed be eating their own technology dog food first in order to see how good it is. We should be actively using some aspect of the products and services we offer in a production environment before we offer them to our customers.

For example:

• To gain a better understanding of how IM would work for virtual reference services, shouldn't librarians be using IM in other aspects of their jobs, e.g. communicating with each other?

• To support wireless and mobile devices, shouldn't staff be using them to perform basic functions, such as working in the stacks?

• If a library offers a laptop distribution service, shouldn't the staff be using laptops themselves?

• If a library provides digitization hardware for customer use, shouldn't the staff assisting these customers be engaged in digitization projects?

• If the library is considering rolling out RSS feeds, wouldn't it make sense that library staff be using them first?


Sure, it will take time for the staff to get used to the taste and texture. Sure, they may not like it at first and wonder why they can't go back to their old food.

In the end we should never expect our customers to eat what we are not willing to eat ourselves.

Monday, July 17, 2006

CAPTCHA!

If you have ever purchased tickets online or posted blog comments, then odds are that you have used a CAPTCHA (a trademark of Carnegie Mellon University) but did not know that the technology had a name or that a large NSA-funded project is behind it.

CAPTCHA stands for "Completely Automated Public Turing test to tell Computers and Humans Apart" and is a challenge-response test used in computing to determine whether or not the user is human. The most common type of CAPTCHA displays an image containing distorted letters of a word or some sequence of letters and numbers. The user then needs to type the letters shown in the distorted image.
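Stripped of the image rendering, a CAPTCHA reduces to generating a random challenge and checking the response against it. Here is a minimal sketch of that challenge-response bookkeeping (the distortion step is omitted, and all names are my own illustration, not any real CAPTCHA library's API):

```python
import random
import string

ALPHABET = string.ascii_uppercase + string.digits

def make_challenge(length=6):
    """Generate a random character sequence; a real CAPTCHA would then
    render it as a distorted image before showing it to the user."""
    return "".join(random.choice(ALPHABET) for _ in range(length))

def verify(expected, response):
    """A response passes only if it matches the challenge text,
    ignoring case and surrounding whitespace."""
    return response.strip().upper() == expected

challenge = make_challenge()
print("Challenge:", challenge)
print("Correct response accepted:", verify(challenge, challenge.lower()))
print("Empty response rejected:", not verify(challenge, ""))
```

The hard part, of course, is the distortion: it must keep the text readable to humans while defeating optical character recognition, which is exactly what the image-based approaches below attack in different ways.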

The different approaches include GIMPY, BONGO, and PIX.

The (infosciences) Carnival is Coming (to town)

The Medium is proud to be hosting the Carnival of the Infosciences #47, which will be posted next Monday, July 24th.

Have you read or authored anything that you wish to share?



Please submit your posts to eric.schnell [at] gmail.com with the word "Carnival" in the subject line.

Closing date is Sunday, July 23rd at midnight.

"Carousel" is by Frottage Cheese and published with rights granted through Creative Commons Attribution-NonCommercial-NoDerivs 2.0

Wednesday, July 12, 2006

Technology and Library Staff Buy-in: Update 1

In an earlier post I discussed how the creation of a technology group could help foster innovation.

Based on ideas gathered from others, my original concept was to create an informal group of techies that would get together on a regular basis to brainstorm and play around in a sandbox. Any practical ideas would be communicated to anyone we felt would be interested. I discussed the idea with library administration and they were supportive.

The reality became that most of us already had so much on our plates (sound familiar?) that we couldn't find a good time to get together on a recurring basis. Therefore, an informal approach to such a group is challenging since "real" responsibilities take precedence (much like ALA committee work).

Timing is everything. Upon returning from ALA, our administration had an unexpected renewed interest in emerging/disruptive technology. I was approached with the idea of formalizing the group, an idea I jumped at. I proposed moving ahead with my original concept, but with a formal charge which will make participation a part of the group members' responsibilities.

One challenge is that a desire was expressed to create formal processes and procedures for the group. The concept presented was that ideas and technologies would go to a larger leadership group for a formal vote on what to continue exploring.

Argh! I feel this approach is very problematic. One cannot manage technology by majority rule. The technologies that would make it through such a process would not be those that are the most innovative. I stand by my viewpoint that such a group has to work outside normal processes.

I will make sure to post updates as things progress.

"Technology is like a fish. The longer it stays on the shelf the less desirable it becomes" - Andrew Heller

Monday, July 10, 2006

Are Most Library Instruction Programs Antiquated?

I remember one of the most unpopular courses in the school library media track in grad school was a nonprint production class. Most of the students disliked it because they didn't understand why knowing how to create nonprint materials was relevant to librarianship. In hindsight, especially in today's environment, many of those students feel that knowing how nonprint materials were created helped make them better selectors and users of nonprint information resources.

While browsing through the LOEX 2005 Conference site I came across a presentation entitled Information Literacy Isn't Enough: Why Librarians Need to Teach More in the Digital Age by Rob Withers and Lisa Santucci of Miami (OH) University that seemed to echo this approach. Their basic premise is that libraries need to move their instructional programs beyond information literacy since "if you have a hammer, everything becomes a nail."

The traditional approach taken in most library instruction programs includes providing step-by-step instructions on using the various systems. Withers and Santucci's approach focuses on gaining an understanding of how information is created and on the information environment in general. The customer gains a better understanding about information from the perspective of a consumer and a producer. They learn how information, and misinformation, can be created and how decisions by creators impact the ability to locate and use information.

I agree.

Since library and information systems in general are constantly changing, it no longer makes sense to teach our customers how to use the systems. Instead, it makes a great deal of sense to teach our customers how to create information, which will help them be better selectors and users of information.

The greatest challenge will not be the customers' acceptance of such a program, but upgrading librarian attitudes toward new and unfamiliar technologies. For such an approach to succeed, librarians also need to first learn how to create content themselves. Such a requirement may be as unpopular now as the nonprint course was in library school.