Browsing Research from April 2016 by Publisher "ACM"
Now showing items 1-4 of 4
Bibliometric-enhanced information retrieval: 7th international BIR workshop
The Bibliometric-enhanced Information Retrieval (BIR) workshop series started at ECIR in 2014 and serves as the annual gathering of IR researchers who address various information-related tasks on scientific corpora and bibliometrics. We welcome contributions elaborating on dedicated IR systems, as well as studies revealing original characteristics of how scientific knowledge is created, communicated, and used. This report presents all papers accepted at the 7th BIR workshop at ECIR 2018 in Grenoble, France.
Displacement error analysis of 6-DoF virtual reality
Virtual view synthesis is a critical step in enabling Six-Degrees-of-Freedom (6DoF) immersion experiences in Virtual Reality (VR). It comprises the synthesis of virtual viewpoints for a user navigating the immersion environment, based on a small subset of captured viewpoints featuring texture and depth maps. We investigate the extreme values of the displacement error in view synthesis caused by depth map quantization for a given 6DoF VR video dataset, particularly as a function of the camera settings, the scene properties, and the depth map quantization error. We establish a linear relationship between the displacement error and the quantization error, scaled by the sine of the angle, formed at the reference camera location, between the location of the object and the virtual view in the 3D scene. In the majority of cases, the horizontal and vertical displacement errors induced at a pixel location of a reconstructed 360° viewpoint comprising the immersion environment are respectively proportional to 3/5 and 1/5 of the respective quantization error. Moreover, the distance between the reference view and the synthesized view severely increases the displacement error. Following these observations, displacement error values can be predicted for given pixel coordinates and quantization error, which can serve as a first step towards modeling the relationship between the encoding rate of reference views and the quality of synthesized views.
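The linear relationship stated in this abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, units, and the use of the 3/5 and 1/5 proportions as fixed scale factors are assumptions based only on the abstract's summary.

```python
import math

def displacement_error(quantization_error, angle_rad, horizontal=True):
    """Estimate the per-pixel displacement error induced by depth-map
    quantization, following the relationship described in the abstract:
    the error is proportional to the quantization error, scaled by the
    sine of the angle (at the reference camera) between the object
    location and the virtual view.

    The 3/5 (horizontal) and 1/5 (vertical) factors are the typical
    proportions reported in the abstract; in practice they would be
    dataset- and camera-dependent.
    """
    scale = 3 / 5 if horizontal else 1 / 5
    return scale * quantization_error * math.sin(angle_rad)

# Illustrative use: a quantization error of 4 depth levels, 30° angle.
err_h = displacement_error(4.0, math.radians(30))         # horizontal
err_v = displacement_error(4.0, math.radians(30), False)  # vertical
print(f"horizontal ~ {err_h:.3f} px, vertical ~ {err_v:.3f} px")
```

At a 90° angle the sine term is 1, so the horizontal error reduces to 3/5 of the quantization error, matching the proportion quoted above.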
Exploration of applying a theory-based user classification model to inform personalised content-based image retrieval system design
To better understand users and create more personalised search experiences, a number of user models have been developed, usually based on different theories or on empirical data studies. After developing such user models, it is important to utilise them effectively in the design, development and evaluation of search systems to improve users' overall search experiences. However, little research has been done on the utilisation of these user models, especially theory-based models, because of the challenges of applying a model to different search systems. This paper explores how to apply an Information Foraging Theory (IFT) based user classification model, called ISE, to identify users' search characteristics and create user groups, based on an empirically driven methodology for content-based image retrieval (CBIR) systems, and how the preferences of different user types inform the personalised design of CBIR systems.
Information foraging for enhancing implicit feedback in content-based image recommendation
User implicit feedback plays an important role in recommender systems. However, finding implicit features is a tedious task. This paper aims to identify users' preferences through implicit behavioural signals for image recommendation, based on the Information Scent Model of Information Foraging Theory. In the first part, we hypothesise that users' perception is improved by visual cues in the images, which act as behavioural signals providing information scent during information seeking. We designed a content-based image recommendation system to explore which image attributes (i.e., visual cues or bookmarks) help users find their desired image. We found that users prefer recommendations predicated on visual cues and therefore consider visual cues good information scent for their information seeking. In the second part, we investigated whether visual cues together with the images themselves are better perceived by users than either on its own. We evaluated the information scent artifacts in image recommendation on the Pinterest image collection and the WikiArt dataset. We find that our proposed image recommendation system supports these implicit signals through the Information Scent Model's account of Information Foraging.