This project develops a semantic media adaptation scheme for mobile information access, perceptually optimized so that users can enjoy semantically relevant media content on small-display mobile devices. The research team explores the seamless integration of multidisciplinary technologies from computer vision, media coding and transmission, wireless networking, mobile devices, and human visual perception to tackle the challenge of closing the gap between rich, high-resolution content and access from size-limited mobile devices.

The intellectual merit of this project lies in the exploration and development of several relevant techniques: (1) media semantic extraction with a limited user interface; (2) capacity- and resource-constrained media content adaptation; and (3) perceptually optimized delivery and display of adapted media on small-screen mobile devices. A brief illustration of item (2) follows below. The research team addresses these issues through the seamless integration of technologies from research fields that traditionally have had little interaction.
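As a concrete illustration of item (2), a capacity- and resource-constrained adapter could select the largest output resolution whose estimated rate fits both the target display and the measured downlink capacity. The sketch below is only illustrative: the candidate resolution ladder, the 0.1 bits-per-pixel rate heuristic, and the function name pick_profile are assumptions for exposition, not the project's actual adaptation algorithm.

# Illustrative sketch (Python): choose an output (width, height, bitrate) so the
# adapted stream fits a small display and the currently measured channel capacity.
# The candidate ladder and the bits-per-pixel heuristic are assumed values.

CANDIDATES = [          # (width, height) options, largest first
    (1280, 720),
    (854, 480),
    (640, 360),
    (426, 240),
]

def pick_profile(display_w, display_h, capacity_kbps, fps=30, bpp=0.1):
    """Return the largest candidate that fits both the display and the channel."""
    for w, h in CANDIDATES:
        if w > display_w or h > display_h:
            continue                               # would exceed the small display
        est_kbps = w * h * fps * bpp / 1000.0      # crude rate estimate
        if est_kbps <= capacity_kbps:
            return (w, h, int(est_kbps))
    # Fall back to the smallest profile, capped by the available capacity.
    w, h = CANDIDATES[-1]
    return (w, h, int(capacity_kbps))

# Example: a 480x320 phone on a 600 kbps link yields (426, 240, ~306) under these assumptions.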

The project bridges both the semantic gap and the user intention gap in mobile multimedia search and access. First, the innovative semantic adaptation scheme can be extended to any media search application based on semantically relevant characteristics. Second, the adaptation of high-resolution media content for small-screen mobile devices plays a key role in the media gateway for wireless mobile access. Finally, the investigation of perceptually optimized display on mobile devices should open up a new research avenue for understanding how mobile users perceive rich media content on small displays.

Project Report

The objective of this research is to develop a platform for mobile devices to access image and video databases over wireless communication. Typical mobile devices are smartphones, which are available everywhere. One major barrier to developing such a platform is adaptively resizing images and video to fit the relatively small display of a smartphone. The most challenging technical issue is to find a region of interest in high-definition images and video and transmit that region for smartphone display so that it matches the user's request; a brief sketch of this adaptation step is given below.

The intellectual merit of this project lies in the exploration and development of several emerging techniques addressing three questions: (1) how mobile search is executed through the limited user interface of a smartphone; (2) how to exchange information over wireless networks whose conditions change constantly; and (3) how to ensure that the images and videos the system returns match the user's request well. We have answered these three questions and have developed algorithms and software that perform the desired functions over simulated wireless networks and smartphones, gaining significant insight into the challenges of these emerging research topics. The research draws on several disciplines, including image/video processing and analysis, wireless communication, mobile display, and human perception.

The broader impact of this project extends to the educational training of graduate and undergraduate students in computer programming and research methodology. The techniques developed in this project have significant implications for the next generation of commercial mobile phones, which will be smart enough to search vast image and video databases for the particular image or video a user wants. We expect to extend these technologies to social networking scenarios in which friends share media content among themselves via smart mobile devices.
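As an illustration of the region-of-interest adaptation step described above, the following minimal sketch crops a salient region from a high-resolution frame and downscales it to a small phone display while preserving aspect ratio. It assumes the Pillow imaging library; the function name adapt_to_display, the example region coordinates, and the 480x320 display size are illustrative assumptions rather than the project's implementation.

# Illustrative sketch (Python, using Pillow): crop a region of interest from a
# high-resolution image and shrink it to fit a small mobile display.
from PIL import Image

def adapt_to_display(image_path, roi_box, display_size=(480, 320)):
    """roi_box is (left, upper, right, lower) in source-image pixels."""
    img = Image.open(image_path)
    roi = img.crop(roi_box)                      # keep only the semantically relevant region
    roi.thumbnail(display_size, Image.LANCZOS)   # downscale in place, aspect ratio preserved
    return roi

# Example: adapt the central region of a 1920x1080 frame for a 480x320 phone screen.
# adapted = adapt_to_display("frame.jpg", roi_box=(660, 290, 1260, 790))
# adapted.save("frame_mobile.jpg", quality=85)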

Budget Start: 2010-04-01
Budget End: 2011-03-31
Fiscal Year: 2009
Total Cost: $59,999
Name: SUNY at Buffalo
City: Buffalo
State: NY
Country: United States
Zip Code: 14260