Monday, December 19, 2005

SmartLibrary - Mobile Services in the Library

I can't believe I missed this one.

Back in May 2003, the Oulu University Main Library, along with the Information Processing and Computer Engineering Laboratories of the Department of Electrical and Information Engineering, launched a service called SmartLibrary. It allows library customers to browse the OULA library catalogue wirelessly with mobile terminal devices.

Access to a library's catalog on a mobile device is not that earth-shattering in and of itself. The interesting feature of this service is that, if the customer wishes, they can request map-based guidance to a desired book or collection. The guidance system is based on dynamic user localization technology developed by Ekahau Inc.

Another new feature is a content production tool meant for the library staff: it allows the clerks to define the locations of shelf classes within the library as well as so-called landmarks, such as a photocopier, group work room or the borrowing desk - basically any target that can be located within the library.

The purpose of the landmarks is to make it easier for users to find the objects they are looking for in the library and to orient themselves at the same time. Landmarks also guide users to other libraries in the organization. SmartLibrary works on devices with an Internet connection and a suitable browser (an HTML browser on a PDA, an XHTML browser on a mobile phone).

* SmartLibrary link for desktop users:
http://www.kirjasto.oulu.fi/zoula/
* SmartLibrary link for PDA users:
http://www.kirjasto.oulu.fi/oulapda/
* SmartLibrary link for mobile phone users:
http://www.kirjasto.oulu.fi/oulamobile/

More Information:

Aittola M, Parhi P, Vieruaho M & Ojala T (2004) Comparison of mobile and fixed use of SmartLibrary. Proc. 6th International Conference on Human Computer Interaction with Mobile Devices and Services, Glasgow, Scotland, 383 - 387.


Aittola M, Ryhänen T & Ojala T (2003) SmartLibrary - Location-aware mobile library service. Proc. Fifth International Symposium on Human Computer Interaction with Mobile Devices and Services, Udine, Italy, 411 - 416.


Hacking Santa

There is an interesting web site created by a guy named Josh McCormick that has received a bit of holiday press.

If you have visited one of the large retailers while shopping for gifts this holiday season, you may have noticed the 5-foot (1.5 m) Animated Singing Santa. Out of the box, this Santa "dances" and sings such classics as "We Wish You a Merry Christmas" and "Jingle Bells."

Well, having time on his hands, the innovative Mr. McCormick figured out a way to hack into the Santa in order to reprogram it with his own dialog. For those who have the time and desire to hack their own Santa, he has posted detailed instructions on his Animated Singing Santa Hack web site.

Friday, December 16, 2005

High Definition (HD) Radio takes to the air

I first subscribed to the satellite radio service XM four years ago. The absence of radio commercials, the all-too-talkative disc jockeys, and the ability to drive state to state without having to search for a decent channel were my primary motivations for subscribing. Other than listening to the Bob and Tom Show and local afternoon host John Corby, I almost never listen to terrestrial radio anymore. Even at work I stream XM radio feeds.

With the number of satellite radio subscribers pushing 10 million in early 2006, and Kagan Research projecting more than 46 million subscribers over the next 10 years, one has to wonder what terrestrial (over-the-air) radio will need to do to compete.

Not to be outdone by satellite, HD Radio(TM) has emerged. HD Radio technology may mark the most significant advancement in radio broadcasting since the introduction of FM stereo. HD Radio is what is called an in-band on-channel (IBOC) digital radio system, created by iBiquity Digital Corporation for broadcasting via existing FM and AM radio stations. HD Radio offers broadcasters and listeners radically upgraded audio quality, along with an on-demand interactive experience and compelling new wireless data services. The technology is designed for multicasting, so consumers can continue listening to the same local AM/FM stations but with added benefits.

With HD Radio, digital AM will have FM-like audio quality, and all broadcasts will provide static-free reception, eliminating the familiar signal fades, static, hissing, and pops. HD Radio also supports wireless data services, including on-demand audio: streaming audio content that provides more information on station programs, news, weather, and traffic. Other wireless data services include the display of artist and song information. The service is similar to the Radio Data System (RDS), but HD Radio can deliver data at 1,000 to 2,000 bits per second compared to RDS's 100 bps.

As of December 2005 there are several hundred HD Radio stations on the air. HD Radio receivers are now coming to market, with BMW the first to announce HD Radio as an option, on its 2006 7-series models. Home listening equipment is currently available from several companies, in both home tuner and tabletop models.

Of course, even with HD Radio you will still get the commercials and the all-too-talkative disc jockeys, and you will still have to hunt for radio stations between cities and states. For long trips I will still have to print out a list of all the Bob and Tom affiliates and hunt for stations.

Tuesday, December 06, 2005

What's a Swicki?

A swicki is a search tool that allows the creation of deep, focused searches on any topic. Swickis are a new type of search engine, or search results aggregator, developed at Eurekster.

Unlike other search engines, a swicki puts a specific community in control of its results, using the collective wisdom of that community to improve them. The resulting search engine, or swicki, can then be published on any web site. The swicki presents search results of interest, pulls in relevant new information as it is indexed, and organizes it all in a customizable widget that can be placed on a web site or blog. A "buzz cloud" constantly updates to show the hot search terms in the community.

Here's how Eurekster describes it:

"Joining the personal Web publishing phenomenon on blogs, podcasts, wikis and websites is the swicki - a next-generation search engine that gives personal and small-business Web publishers the power to design and deliver results tailored to their community's specific interests. A blend between a search engine and a wiki."

So, swickis are like wikis in that they are created by one person but can then be set to allow many others to get involved. Currently, swickis automatically and anonymously learn from the search patterns of the community of users. Future features will allow the community to actively change the search results by explicitly deleting and promoting results. Creators will be able to control collaboration so that one of the following occurs:

  • Only the creator can actively and passively influence the swicki
  • Only the creator can actively influence the swicki and everyone's behavior passively influences the swicki
  • More than one person can actively influence the swicki, and everyone's behavior passively influences the swicki
  • Everyone can actively and passively influence the swicki

The creator populates the site with the most common URLs for the community as well as the most common "buzz cloud" terms. As searches are performed, the most commonly searched terms become quick links in the buzz cloud list. Setting up a swicki is simple. It took me less than 5 minutes to create Open Source Systems for Libraries.

For More Information

LibraryCrunch's Library 2.0 swicki. An excellent library example put together by Michael Casey.


Luther L. Gobbel Library
of Lambuth University

SmartFox (Firefox Scholar)

Scholars and researchers using library online catalogs, collections, and documents often have to utilize a series of stand-alone applications to make citations, take notes, and create personal collections and bibliographies. SmartFox (a.k.a. Firefox Scholar) is a new tool that may help.

SmartFox is an open source tool being developed by the Center for History and New Media (CHNM) at George Mason University with funding from the Institute of Museum and Library Services (IMLS). Also referred to as a scholar's web browser, SmartFox aims to enable users to grab a citation to a book, journal article, archival document, or object and store it in the browser. Researchers can then take notes on the reference, associate it with others, and organize any metadata and annotations. The information gathered by SmartFox and the researcher is stored on the scholar's computer and is fully searchable.

No time frame for product release is available on the project site.

Friday, October 14, 2005

Are Journal Impact Factors Being Misused?

An interesting article by Richard Monastersky entitled "The Number That's Devouring Science" was published in the October 14th issue of the Chronicle of Higher Education. The article discusses how the journal impact factor, once a simple way to rank scientific journals, has become an unyielding yardstick for hiring, tenure, and grants. The problem, it seems, is that the impact factor was developed to select the most important journals for a new citation index, not as an article-level evaluation tool.

The ISI Citation Index, conceived in 1955 and developed in the 1960s primarily by Eugene Garfield, has become one of the most widely used citation tools in the sciences and social sciences. Garfield devised the impact factor to help select the most important journals for the new index: including only the journals that received the most citations would have tended to eliminate smaller publications, so he came up with a grading system that could help him pick out the most important publications from the ranks of lesser titles. The impact factor reflects the average number of citations per article for each journal.

Each year, the number by which science journals live and die is computed from a simple formula. To calculate the impact factor for journal X, Thomson ISI examines the 7,500 journals in its index to find the number of citations to X in the previous two years. That number becomes the numerator. The denominator is the number of original research articles and reviews published by X in the previous two years.
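The formula above reduces to a single division; here is a minimal sketch in Python, where the citation and article counts are made-up illustration values, not figures from the article:

```python
def impact_factor(citations_to_prev_two_years, articles_in_prev_two_years):
    """Two-year journal impact factor: citations received this year to
    items the journal published in the previous two years, divided by
    the number of citable items it published in those two years."""
    return citations_to_prev_two_years / articles_in_prev_two_years

# Hypothetical journal X: 640 citations in 2005 to its 2003-2004
# output of 200 research articles and reviews.
print(impact_factor(640, 200))  # 3.2
```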

Impact factors caught on since they are an objective measure that serves many purposes. Librarians use them to decide which journals to purchase or cancel. Editors and publishers can gauge their progress relative to their competitors. Scientists can examine the numbers to see where their research papers are more likely to get the most attention.

However, the measurement is an average of all the papers in a journal over a year; it doesn't apply to any single paper, let alone to any author. According to Monastersky, a quarter of the articles in Nature in 2004 drew 89 percent of the citations to that journal, so a vast majority of the articles received far fewer than the average of 32 citations reflected in the most recent impact factor. Mr. Garfield and ISI routinely point out the problems of using impact factors for individual papers or people. According to the article, Jim Pringle, vice president for development at Thomson Scientific, which oversees ISI, responded that "It is a fallacy to think you can say anything about the citation pattern of an article from the citation pattern of a journal."

The pressure to publish in the highest-impact journals in order to succeed in the tenure process and grant competitions has led researchers to compete more and more for the limited number of slots in those journals. Impact factors may also affect the kind of research that is conducted: top journals require that papers be topical, so researchers may shift the kinds of questions they investigate to accommodate those high-impact journals. The question has become whether impact rankings have begun to control scientists, rather than the other way around.

Monastersky also detailed that in 1997 the Journal of Applied Ecology cited its own one-year-old articles 30 times. By 2004 that number had grown to 91 citations, a 200-percent increase. Similar types of citations of the journal in other publications had increased by only 41 percent.

Steve Ormerod, executive editor from 2000 through 2004, wrote several editorials during his tenure that cited his own journal dozens of times. In 2002, for example, two of his commentaries cited 103 papers published in the journal during 2000 and 2001. Those two editorials alone raised his journal's 2002 impact factor by 20 percent. The self-citations at his publication had a measurable effect since the journal's impact factor jumped from 1.3 in 1997 to 3.3 in 2004, and its ranking within the discipline rose from 29th out of 86 journals to 16th out of 107.

For More Information

Garfield E. Citation indexes to science: a new dimension in documentation through association of ideas. Science. 1955;122:108–111.

Garfield E, Sher IH. Genetics Citation Index. Philadelphia, Pa: Institute for Scientific Information; July 1963.

Garfield E. The obliteration phenomenon in science -- and the advantage of being obliterated! Essays of an Information Scientist. 1975;2:396-398.

Garfield E. Which medical journals have the greatest impact? Ann Intern Med. 1986;105:313-320.

Monday, October 10, 2005

Jaume: The Robot "Librarian"

Jaume is the name of a "librarian" robot being developed at the Robotic Intelligence Lab of Universitat Jaume I (UJI) in Castellón, Spain, by a research group led by Prof. Angel P. del Pobil.

According to the project web site, the goal for Jaume is to search for and retrieve a book requested by a customer. The operation starts when the user requests a book by its name or code, either over the Internet or by voice. The robot is then in charge of locating the book in an ordinary library, extracting it, and taking it to the user. The only initial information is the book code, written on a label that is read by the vision system. This general application integrates several interdisciplinary skills such as path planning, visual perception, and multisensory-based grasping, all linked together by reasoning capabilities.

The robot consists of several systems, including a camera that supports the navigation and book recognition systems. Optical character recognition (OCR) is used to read the labels in order to identify the materials. A mechanical arm is then used to extract the book from the shelf.

Automated storage and retrieval systems (ASRS) are not new. The systems were first introduced in the early 1960s, and since then thousands of different types have been developed. The Prior Health Sciences Library at The Ohio State University, where I work, was actually home to one of Sperry Rand Corporation's five "Randtrievers." It was a mechanical book storage system that required looking up a call number and giving it to a reference clerk who, in turn, matched it with an accession number (an undifferentiated string of 10 or 12 numeric characters), which, in turn, was entered via keyboard to retrieve the desired object.

By 1989 all of the American systems were out of service as the result of problems with suppliers (Rand abandoned the system), unanticipated maintenance costs, crude equipment and primitive computer control, and ignorance of customer requirements.

Among the deficiencies of the system was the need for people to enter such numbers without error. Something like 30% of the failures to retrieve had simply to do with this human-hostile resource identifier being mistyped. Scanners alone would have reduced the number of errors, but scanner technology took off only after the systems went offline, primarily due to mechanical issues.

The question is which research projects like Jaume will turn into practical library technologies and which ones will end up as the next Randtrievers.

For More Information

John Kountz, "Automated Storage and Retrieval (AS/R) Systems of the Past: Why Did They Fail?" Library Hi Tech 31, no. 3 (1990): 87.

Barbara VanBrimmer, Elizabeth Sawyers, and Eric Jayjohn, "The Randtriever: Its Use at the Ohio State University," Library Hi Tech 8, no.3 (1990): 71.

Loo, Jeffery. ASRS (Automated Storage and Retrieval Systems) in Academic Libraries. 2001.

Prats M., Ramos-Garijo R., Sanz P.J., del Pobil A.P. Autonomous Localization and Extraction of Books in a Library. In Intelligent Autonomous Systems, edited by F. Groen et al., Amsterdam: IOS Press, 2004, pp. 1138-1145.

Prats M., Ramos-Garijo R., Sanz P.J., del Pobil A.P. Recent Progress in the UJI Librarian Robot. In Proc. of IEEE International Conference on Systems, Man & Cybernetics, 2004.

Monday, October 03, 2005

ICML9: 9th International Congress on Medical Librarianship

Every five years, the world's medical librarians gather to discuss our profession, to hear and see the latest developments, and to exchange news and ideas. The 9th International Congress on Medical Librarianship was held September 20-23, 2005, in the city of Salvador, in the state of Bahia, Brazil. The theme for the program was "Commitment to Equity." I presented a paper on a research project entitled docMD: document Mediated Delivery.

The conference was set in a very scenic area of the country. The people were friendly and the atmosphere was very laid back. During our 12-day stay in Brazil we were struck that we did not encounter one rude person. This included the airports! People who did not speak English, and there were plenty of them, generally worked with us, and we were able to get by, although some of our food orders were not quite as expected. Negotiating sales with market vendors was also a challenge, since reading Portuguese and hearing it are quite different. Spanish speakers were generally able to follow Portuguese, but even they commented that Brazilians spoke fast for them.

Many attendees, however, commented that the conference planners were overzealous in their efforts to protect attendees. For example, at one nighttime event we were escorted in groups by police from our buses through the old town area to the event site. We were only allowed to leave in large groups, again with police escort. Perhaps they were simply trying to protect us from the various street vendors, who were very aggressive in their marketing approach.

Attendees became fearful that the city was very dangerous. Like any larger city (Salvador has a population of 2.5 million), it certainly has issues with personal safety. Many Brazilians I talked to afterwards indicated that the use of escorts was extreme. While I appreciate the planners' concern, this approach set the tone for both the city and the meeting. Some attendees did not want to venture outside their hotels except in large groups.

The conference itself was plagued with logistical issues, from the lack of translation services to transportation. Many of the issues may have resulted from the fact that most of the conference planning fell on the shoulders of one very overworked individual and her staff. I give her all the credit in the world for pulling the conference off.

About three-quarters of the papers presented were in Portuguese or Spanish. With no translation services available, I was unable to understand what was being discussed almost all of the time. At the same time, three-quarters of the attendees were unable to understand what I said. While this may be typical of international conferences, it was frustrating, especially after all the time and energy it took to get to the conference.

Thursday, September 08, 2005

Google's Digital Library Project and Copyright

Google has decided to go in a new direction with their Print for Libraries program after the publishing community began voicing their objections to the project. Google announced on August 12, 2005 that it will temporarily stop digitizing in-copyright books from its library partners and will concentrate on public domain materials (any book published before 1923 or ever published by the U.S. government).

In May 2005, the Association of American University Presses challenged the Google Print for Libraries program as a major breach of copyright. Google says it will include copyrighted material but will give publishers a chance to request that their books not be included. But the opt-out offer does not address the belief that the entire program is built on a foundation of purposeful copyright violation.

As explained on the Google Print site under "Information for Publishers about the Library Project": "Our ultimate goal is to work with publishers and libraries to create a comprehensive, searchable, next-generation card catalog of all books in all languages that helps users discover new books and [helps] publishers find new readers." The list of benefits for Google Print publishers remains very lengthy. It even refers to the value of market research data collected for book titles, which allows publishers to track the number of people who look at a book's pages and/or click on the "Buy This Book" link, in order to decide whether the book might be worth reprinting.

Google's library is an ambitious project that began in December 2004 by scanning books not covered by copyright at five libraries, including those at Harvard and Stanford universities. The texts are fed into the Google search engine. A Google search for "George Washington" will turn up not only Web pages about the president but books that mention him as well. The intent is to help users find books, not to read them online. Books are displayed a page at a time, which makes it difficult to read long passages.

Publishers say Google should ask permission to use their books, rather than requiring them to opt out. Free-speech advocates say Google shouldn't give publishers the choice of opting out, because copying the books for searches is a fair use. Critics of the program said that Google is putting the weight of copyright protection on the copyright holders instead of the violators. Google's position is that what it is doing is legal under the principles of fair use.

Google said it would go ahead with plans to digitize, and make searchable, works that are in the public domain, that is, those whose copyrights have expired. But in response to discussions with publishers, authors and others who hold copyrights, Google said it would wait until at least Nov. 1 before beginning to scan works that are still under copyright.

While Google's intent may be honorable, it is not to make an archival copy; instead, the plan is to create a copy that would be redistributed in a derivative format. While Google claims that the method of distribution would make the books difficult to read, that is not the issue. The distribution of copyrighted materials without permission from the copyright holder would appear to be a violation of copyright law.

Although Google may appear to violate the law by scanning without permission the entire copies of books protected by copyright, such an act is not illegal if it is considered “fair use” of the material. The most important issues for the courts would be the character of Google’s activity, its adverse economic impact on the copyright holder, and the amount of material it uses in proportion to the whole and if that is key to the work.

There is a legal precedent for any court battle between publishers and Google, and it favors Google. In the 2003 Kelly v. Arriba Soft Corp. case, a photographer sued a search engine, claiming copyright infringement for displaying thumbnail images of work originally posted on his site. The Ninth Circuit found in favor of the search engine, stating that the act of copying the material, even though it was for commercial purposes, was not exploitative and therefore was fair use.

The question being asked is whether the Ninth Circuit's ruling in Arriba applies to the Print Library Project. Attorney Jonathan Band has argued that Google's copies of books will not replace the originals and that the company does not profit from the sale of any books it scans; Band does not represent any entity with respect to the Google Print project.

The debate is unlikely to be resolved soon. An injunction stopping Google from proceeding is a possibility. However, some legal experts feel that despite objections from publishers and writers copyright law appears to be on Google’s side. The social value of Google’s initiative to digitize library books, including those protected by copyright, will likely weigh heavily in the search engine’s favor.

Resources

Huen, Christopher T. Courts Unlikely to Stop Google Book Copying. Internet Week, September 2, 2005.

Tuesday, August 23, 2005

RSS Feeds: A Useful Library Tool?

Known as Really Simple Syndication or Rich Site Summary, RSS was created by Netscape in 1999 for use on the My Netscape portal. RSS is now used by the weblog community to share entries, full text, and even attached multimedia files, and by many major news organizations to let other websites incorporate their "syndicated" headline or short-summary feeds. It is now common to find RSS feeds on many web sites.

A program known as a "feed reader" or "news aggregator" can check RSS-enabled webpages on behalf of a user and display any updated articles that it finds. Client-side readers and aggregators are typically standalone programs or extensions to existing programs such as web browsers. Web-based feed readers and news aggregators require no software installation and make RSS feeds available on any computer with web access. Some aggregators combine RSS feeds into new feeds, e.g., taking all college football items from several feeds and providing a new college football RSS feed.
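A minimal sketch of what such an aggregator does, using only Python's standard library; the two-item feed and the "college football" filter below are made-up illustration values:

```python
import xml.etree.ElementTree as ET

# A tiny hypothetical RSS 2.0 feed, inlined for illustration.
RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Example Sports Feed</title>
  <item><title>College football rankings shuffle</title>
        <link>http://example.com/1</link></item>
  <item><title>Local weather update</title>
        <link>http://example.com/2</link></item>
</channel></rss>"""

def filter_items(rss_text, keyword):
    """Parse an RSS 2.0 document and keep the titles of items
    whose title contains the keyword (case-insensitive)."""
    root = ET.fromstring(rss_text)
    return [item.findtext("title")
            for item in root.iter("item")
            if keyword.lower() in item.findtext("title", "").lower()]

print(filter_items(RSS, "college football"))
# ['College football rankings shuffle']
```

A real aggregator would fetch several such feeds over HTTP, merge the matching items, and re-emit them as a new RSS document, but the parse-and-filter step is the core of it.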

While different versions of the RSS standard are available, RSS 2.0's support for enclosures has made it the effective standard for podcasting, and it is the format supported for that use by iTunes and other software designed to play podcasts.

RSS feeds are typically linked to with an orange rectangle bearing the letters "XML" or "RSS".

Libraries can make use of RSS feeds to help customers stay current with important library news and information. Content delivered by RSS could include general announcements, technology changes, new collection materials, and employment opportunities. Subject specialists and departmental libraries can use RSS feeds to update their primary users on issues in their topic areas.

The website RSS4Lib is a great tool for uncovering innovative uses for RSS in libraries.


Resources

Steven M. Cohen RSS For Non-Techie Librarians. Created June 3, 2002.
Andrew King Introduction to RSS. Created March 27, 2000; Revised April 14, 2003.
Firstgov.gov
Google's Directory of RSS feeds

Wednesday, July 13, 2005

High Definition DVDs: a Beta vs. VHS Redux?

In the 1980s, two competing video formats fought it out in American living rooms. Betamax, released by Sony in 1975, was arguably the better technology; in fact, many musicians used it to make digital recordings a decade before digital recording became commonplace. However, VHS, released by JVC in 1976, allowed significantly longer recording times (the original Beta format was limited to one hour; VHS could record six).

A new war is looming between two incompatible types of high-definition video discs scheduled to hit the market later this year. Hollywood studios have committed to releasing scores of high-definition DVD movies later this year. Two camps backing incompatible next-generation technologies, led respectively by Sony and Toshiba, have as yet failed to agree on a way to unify their products.

The two technologies are known as HD DVD and Blu-Ray. The core difference between the two formats lies with a single aspect of the disc — a thin layer of plastic that sits just above the metal surface on which data is written. An HD DVD disc calls for a 0.6 millimeter coating, while a Blu-ray disc requires 0.1 millimeters. Blu-ray's thinner coating is the secret behind the disc's higher capacity. Since the laser travels through a thinner layer of resin, it's able to focus more sharply and write 67 percent more data onto the disc itself.

Sony's Blu-ray gets its name from the blue laser that, in addition to other techniques, allows it to store substantially more data on the same size disc than a DVD can. A single-layer Blu-ray disc can hold about 25 GB, or over two hours of HD video plus audio, and a dual-layer disc can hold approximately 50 GB. Sony has already demonstrated 200 GB eight-layer technology. Blu-ray is supported by Apple, Dell, Hewlett-Packard, Hitachi, LG, Mitsubishi, Panasonic (Matsushita Electric), Pioneer, Samsung, Sharp, Sony, Twentieth Century Fox, and Walt Disney.

Toshiba's HD DVD has a single-layer capacity of 15 GB and a dual-layer capacity of 30 GB. HD DVD media is less expensive to manufacture than Blu-ray discs, which require re-tooling of DVD disc production lines. HD DVD players can also rely on some of the same technology as conventional DVD players, making it easier to build units that can handle both generations of disc. HD DVD is supported by Canon, Fuji, NEC, Onkyo, Paramount, Ricoh, Sanyo, TEAC, Toshiba, and Warner Home Video.
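The "67 percent more data" figure quoted earlier follows directly from the two single-layer capacities; a quick arithmetic check:

```python
# Single-layer capacities quoted in the text, in gigabytes.
blu_ray_gb = 25
hd_dvd_gb = 15

# Blu-ray's capacity advantage over HD DVD, as a percentage.
advantage = (blu_ray_gb - hd_dvd_gb) / hd_dvd_gb * 100
print(round(advantage))  # 67
```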

If history is any indicator of the future, Sony may have learned its lesson, and Blu-ray's larger recording capacity will win out. Sony's plan to include a Blu-ray drive in its PlayStation 3 should give the format a strong foothold.

Resources:

HD DVD Promotion Group
hddvd.org
blu-ray.com
Blu-ray Disc Association

Wednesday, July 06, 2005

Bluetooth: Bringing Devices and Information Together

Bluetooth® is a wireless standard developed by the Bluetooth Special Interest Group, an industry association of electronics manufacturers. The standard allows different types of wireless devices, such as headsets, PCs, mobile phones, keyboards, and handhelds, to communicate with each other without the need for extensive setup by the user. While the technology has been slow to develop, more and more mobile phone and car manufacturers are including Bluetooth. The time for Bluetooth is here.

Bluetooth-enabled devices begin communicating as soon as two or more of them are within range of each other, by periodically broadcasting inquiry messages. If there is a response, the originator of the inquiry becomes the master unit and the responder becomes the slave. After this initial contact, the master sends the slave information about how they will communicate, including the initial frequency and phase. The result is a series of ad hoc networks called piconets, with collections of piconets forming scatternets. In each case, the connections are peer-to-peer.

Bluetooth technology uses the 2.45 GHz (gigahertz) band, which it shares with wireless technologies such as 802.11 (Wi-Fi); the two can coexist because they use very different standards, and a Bluetooth device will never mistake Wi-Fi for a Bluetooth transmission. Bluetooth devices transmit a 1 mW (milliwatt) signal that travels about 33 feet. The technology employs "spread-spectrum frequency hopping," regularly switching transmission among 79 individual frequencies. The switch happens 1,600 times per second, so it is improbable that two Bluetooth devices will be using the same frequency at the same time, which allows many Bluetooth devices to run in close proximity without interfering with each other.
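A rough sketch of why hopping across 79 channels keeps interference rare: if two unsynchronized devices each pick channels independently, they land on the same channel in roughly 1 hop in 79. The channel count and hop rate below come from the text; modeling the hop sequence as uniformly random is a simplification of the real pseudo-random hopping scheme:

```python
import random

CHANNELS = 79           # hop channels, as described in the text
HOPS_PER_SECOND = 1600  # hop rate, as described in the text

def collisions_per_second(seed=0):
    """Simulate one second of two independent hoppers and count how
    often they happen to land on the same channel."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(HOPS_PER_SECOND):
        if rng.randrange(CHANNELS) == rng.randrange(CHANNELS):
            hits += 1
    return hits

# Expected value is 1600/79, about 20 colliding hops per second,
# and each hop lasts only 625 microseconds.
print(collisions_per_second())
```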

Academic and public libraries could reap the benefits of a Bluetooth-enabled network. Librarians at the reference desk assisting patrons in their research could transmit the results directly to the customer's PDA, avoiding the need to print the documents. The transmitted information could include rights and allowable-use information, provenance, indexing and cataloging data, and any digital media content associated with the item. Customers connected to the library network could share information electronically in ad hoc user groups.

An instructor could turn student Bluetooth devices into instant workstations and create an on-the-fly computer lab. A Bluetooth cell phone or PDA could serve as a PC remote control for professional presentations. Bluetooth printers could be placed throughout the library for those who wish to print from mobile devices, requiring no network cables and eliminating the line-of-sight problems of infrared printers. Visually impaired customers could use a Bluetooth pen reader that scans printed text and speaks it into a headset.

The technology received its name from Harald Bluetooth, son of Denmark's first king, Gorm the Old. The Danish word for blue, blå, also meant dark, and the words for man, mand, and tooth, tand, sound much the same. At the time of his rule, somewhere between 940 and 980 AD, southern Sweden was part of Denmark. In southern Sweden is Lund, the city in which Ericsson developed the Bluetooth technology. According to Ericsson, "One of his skills was to make people talk to each other," and hence the choice of Bluetooth.


Resources:

Mott Allen, Maryellen. "Bluetooth Bites Information Retrieval." ONLINE, May 2001.

Guscott, John. "These Emerging Technologies Will Change Public Libraries." Library Futures Quarterly, May 2001.

Monday, June 27, 2005

Supreme Court Rules Against File-Swapping Firms

The Supreme Court has handed movie studios and record labels a victory against file-swapping, ruling that peer-to-peer companies could be held responsible for the copyright piracy on their networks.

In the ruling on Metro-Goldwyn-Mayer Studios Inc. et al. v. Grokster, Ltd., et al., Justice David Souter, writing for the majority, stated: "We hold that one who distributes a device with the object of promoting its use to infringe copyright, as shown by clear expression or other affirmative steps taken to foster infringement, is liable for the resulting acts of infringement."

The problem with this ruling is deciding when a company can fairly be held responsible for what users choose to do with its products. Should gun makers be held responsible for crimes committed with their firearms? Should lockpick makers be responsible for those who use their tools in break-ins? Should VCR and DVD manufacturers be responsible for illegal copying of copyrighted content?

The key word in the ruling, and the one whose definition will be interesting to watch, is "promoting." It would seem that software developers could get around this ruling by building their marketing campaigns around the "rights" of copyright holders.

One fear is that this ruling has the potential to rewrite the Supreme Court's 1984 Sony Betamax decision, which made VCRs legal to sell. The implications for other technologies such as DVD recorders and personal video recorders (e.g., TiVo) are troubling.

Wednesday, June 22, 2005

WiMAX: Broadband on the Move

Most mobile computing users know which of their local coffee shops, bookstores, and libraries have wireless access. Many have set up wireless networks within their homes. Now, imagine having high-speed connectivity at home, around town, and even driving down the highway.

WiMAX is an Internet connectivity protocol very similar to WiFi (wireless fidelity). WiFi is a short-range network technology that allows anyone with a laptop to access the Internet within about 150 feet of a hot spot. WiMAX (Worldwide Interoperability for Microwave Access) is similar to WiFi but has a potential range of up to 30 miles, eventually allowing Internet access at broadband speeds.

The WiMAX standard is known as 802.16 and is being marketed by the WiMAX Forum, a consortium of over 80 members. With about 20 percent of the US not within reach of cable or DSL, WiMAX is positioned as a "third pipe," a technology that can compete with cable and DSL services for broadband network connectivity.

WiMAX utilizes base stations resembling cellular towers, which transmit signals to subscriber stations. With speeds of 5 to 10 Mbps, the technology is also faster than cable or DSL and can offer a cost-effective alternative to those technologies. While the signal can theoretically extend to 30 miles, the practical range is 3-5 miles, depending on tower height, antenna gain, and transmit power.
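A back-of-the-envelope free-space link budget shows why practical range falls so far below the theoretical maximum: every doubling of distance costs another 6 dB of signal. The carrier frequency (3.5 GHz), transmit power, and antenna gains in this sketch are illustrative assumptions, not figures from any WiMAX deployment:

```python
import math

# Back-of-the-envelope link budget using the free-space path loss
# (Friis) formula. The 3.5 GHz carrier, 43 dBm transmit power, and
# antenna gains are assumed values for illustration only; real links
# also suffer terrain, foliage, and building losses.

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

def received_power_dbm(tx_dbm, tx_gain_dbi, rx_gain_dbi, distance_m, freq_hz):
    """Received signal strength = transmit power + antenna gains - path loss."""
    return tx_dbm + tx_gain_dbi + rx_gain_dbi - fspl_db(distance_m, freq_hz)

# Compare a practical 3-mile link with the theoretical 30-mile maximum.
for miles in (3, 30):
    meters = miles * 1609.34
    rx = received_power_dbm(43, 15, 2, meters, 3.5e9)
    print(f"{miles:2d} mi: received power = {rx:.1f} dBm")
```

Even in free space, the 30-mile link arrives 20 dB weaker than the 3-mile one, and that is before terrain and obstructions are counted, which is why tower height and antenna gain matter so much in practice.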

The initial version of the technology is meant as a fixed network access point, not a mobile one. In 2006 and 2007, portable WiMAX will enter the market. Companies offering fixed-antenna WiMAX service could then offer portable service within their service footprints, and manufacturers will begin to integrate WiMAX into PC cards, laptops, and portable devices.

A rival standard, 802.20 or Mobile-Fi, was designed for mobility from the start, so it can handle communications in vehicles moving at up to 155 MPH. However, WiMAX will hit the market first.

Resources:

Cohan, Alan. "WiMAX: The wireless net gets extreme" PC Magazine. July 13, 2004

Martin, James A. "What's next for wireless?" PC World. July 29, 2004

Miller, Matthew. "When, where, and WiMAX" EDN. May 24, 2004

WiMAX Networking News

WiMAX Forum

Tuesday, June 21, 2005

Welcome

Welcome to my blogspace. My goal is to post content dealing with technology and how it can be used to manage knowledge and information.