MBIB4230 Information Retrieval
This course deals with theories, methods and models for constructing, using and evaluating automatic information retrieval systems. This includes input from linguistics, mathematics, statistics and information theory. Central topics include:
- statistical and semantics-based methods for document description and retrieval
- automatic classification and categorization
- search behaviour and how to construct systems for real users
- new methods, media and arenas for information retrieval, such as image and multimedia retrieval, retrieval of multilingual material etc.
Required prerequisite knowledge
The course presupposes knowledge from the bachelor courses BIB3210/BIB3220/BIB3230.
After completion of the course, the student has
- advanced knowledge of the theoretical foundations of a variety of models for automatic information retrieval, and of how these models can be realized with various algorithms
- advanced knowledge of user-oriented views on information retrieval, both cognitive and social, and their consequences for user interfaces, relevance judgements and interactivity in the retrieval process
- advanced knowledge of linked data and other semantic tools used to structure and make data available, and how to utilize such data
- thorough knowledge of practical experiments for evaluating information retrieval systems and models
- advanced knowledge of computational linguistics for analyzing grammar and semantics, and of how this can be used in automatic information retrieval systems
After completion of the course, the student can
- participate in, and gain practical experience with, the development and implementation of user-friendly information retrieval systems and modules
- evaluate such systems as a basis for acquiring and using them
Lectures, exercises and seminars. This includes presentation of a term paper, written individually or in groups, for discussion. Teaching will be in English when there are foreign exchange students present.
The assessment has two parts:
A term paper and an individual six-hour school examination. The term paper can be written individually (ca. 15 pages) or in groups of 2-3 students (ca. 20 pages).
The two parts get separate marks. The term paper counts for 40 percent and the school examination counts for 60 percent of the total and final mark. The student must pass both parts to pass the course.
Students who have failed a regular examination are entitled to sit a new examination in the part(s) they have failed. If a group term paper fails, the whole group needs to sit the new examination.
Letter grading A-F. An internal and an external examiner make the assessment.
Examination support material
For examinations under surveillance, syllabus texts and individual notes may be used.
Syllabus MBIB4230 Information Retrieval, spring 2018
Baeza-Yates, R., & Ribeiro-Neto, B. (2011). Modern information retrieval: The concepts and technology behind search (2nd ed.). New York: Addison-Wesley Professional. Syllabus: Ch. 1-2, 3 (pp. 57-81, 104-113, 124-130), 4 (pp. 131-143, 159-176), 5 (pp. 177-183, 185-202), 6 (pp. 203-238), 7 (pp. 255-274), 8 (pp. 281-294, 300-304), 9 (pp. 337-344), 11 (pp. 447-472, 477-514), 12 (pp. 515-526), 13 (pp. 545-548), 14, 16-17
Belew, R. K. (2008). Finding Out About: A Cognitive Perspective on Search Engine Technology and the WWW. Cambridge: Cambridge University Press. Syllabus: Ch. 1 and 2
Belkin, N.J., Oddy, R. & Brooks, H. (1982). ASK for information retrieval: Part I. Background and theory. Journal of Documentation, 38(2), 61-71. https://doi.org/10.1108/eb026722
Borlund, P. (2000). Experimental components for the evaluation of interactive information retrieval systems. Journal of Documentation, 56(1), 71-90. https://doi.org/10.1108/EUM0000000007110
Borlund, P. (2003a). The concept of relevance in IR. Journal of the American Society for Information Science and Technology, 54(10), 913-925. https://doi.org/10.1002/asi.10286
Borlund, P. (2003b). The IIR evaluation model: A framework for evaluation of interactive information retrieval systems. Information Research, 8(3). Retrieved from http://informationr.net/ir/8-3/paper152.html
Borlund, P. (2016). A study of the use of simulated work task situations in interactive information retrieval evaluations: A meta-evaluation. Journal of Documentation, 72(3), 394-413. https://doi.org/10.1108/JD-06-2015-0068
Datta, R., Joshi, D., Li, J., & Wang, J. Z. (2008). Image retrieval: Ideas, influences, and trends of the new age. ACM Computing Surveys, 40(2), 5:1–5:60. https://doi.org/10.1145/1348246.1348248 (Sections 3.2 and 3.3 excluded)
Ellis, D. (1996). Progress and problems in information retrieval. London: Library Association Publishing. Syllabus: Ch. 1 (pp. 1-22)
Fellbaum, C. (1998). WordNet: An electronic lexical database. Cambridge, Mass: MIT Press. Syllabus: Ch. 1, 4, 12
Harman, D. (1993). Overview of the first TREC conference. In Proceedings of the 16th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval (pp. 36–47). New York, NY: ACM. https://doi.org/10.1145/160688.160692
Ingwersen, P. (2000). Users in context. In: Agosti, M., Crestani, F. & Pasi, G. (eds.), Lectures on information retrieval. Third European Summer-School, ESSIR, Varenna, Italy, 2000. Heidelberg: Springer-Verlag, 157-178. Retrieved from http://link.springer.com/chapter/10.1007/3-540-45368-7_8#page-1
Ingwersen, P., & Järvelin, K. (2005). The turn: Integration of information seeking and retrieval in context. Dordrecht: Springer. Syllabus: Ch. 2, 6
Lancaster, F.W. (1969). Medlars: Report on the evaluation of its operating efficiency. American Documentation, 20, 119-142. https://doi.org/10.1002/asi.4630200204
Orio, N. (2006). Music retrieval: A tutorial and review. Foundations and Trends in Information Retrieval, 1 (1), 1–90. https://doi.org/10.1561/1500000002
Page, L., Brin, S., Motwani, R., & Winograd, T. (1998). The PageRank citation ranking: Bringing order to the web. 17 pp. Unpublished paper. Retrieved from http://ilpubs.stanford.edu:8090/422/1/1999-66.pdf
Robertson, S.E. & Hancock-Beaulieu, M.M. (1992). On the evaluation of IR systems. Information Processing & Management, 28(4), 457-466. https://doi.org/10.1016/0306-4573(92)90004-J
Ruthven, I., & Kelly, D. (Eds.). (2011). Interactive information seeking, behaviour and retrieval. London: Facet. Syllabus: Ch. 1-4, 8, 12-13
Saracevic, T. (2009). Information Science. In: Bates, M.J. & Maack, M.N. (eds.), Encyclopedia of Library and Information Science. New York: Taylor & Francis, 2570-2586. Retrieved from http://comminfo.rutgers.edu/~tefko/SaracevicInformationScienceELIS2009.pdf
Schamber, L., Eisenberg, M. B., & Nilan, M. S. (1990). A re-examination of relevance: Toward a dynamic, situational definition. Information Processing & Management, 26, 755-776. https://doi.org/10.1016/0306-4573(90)90050-C
(Literature list last updated: 01.12.2017)
About the course
Information Retrieval (Informasjonsgjenfinning). Master's programme in Library and Information Science (Masterstudium i bibliotek- og informasjonsvitenskap). 15 ECTS. Spring and autumn. Teaching will be in English when there are foreign exchange students present. 2017-2018.
Apply for admission to this course
You can apply for admission to this course outside the master programme; more information is available on the programme's admission pages. Application deadline: 9 January 2017.
The admission requirements are the same as for the Master's programme in Library and Information Science (bibliotek- og informasjonsvitenskap), plus an average mark of C from the bachelor's degree. Location: Pilestredet. Tuition fee: NOK 820.
The fee consists of a semester fee to the Student Welfare Organisation of Oslo and Akershus (SiO) and a copying fee.