Please use this identifier to cite or link to this item: https://repository.sustech.edu/handle/123456789/27868
Full metadata record
DC Field: Value [Language]
dc.contributor.author: Mostafa, Fatima Mohamed Yassin
dc.contributor.author: Supervisor, Adel M. Alimi
dc.date.accessioned: 2022-12-04T11:41:29Z
dc.date.available: 2022-12-04T11:41:29Z
dc.date.issued: 2022-08-01
dc.identifier.citation: Mostafa, Fatima Mohamed Yassin. Development of Deep Neural Networks Framework for Travel User Interest Discovery from Visual Shared Data in Social Networks / Fatima Mohamed Yassin Mostafa; Adel M. Alimi. - Khartoum: Sudan University of Science & Technology, College of Computer Science and Information Technology, 2022. - 113 p.: ill.; 28 cm. - Ph.D. [en_US]
dc.identifier.uri: https://repository.sustech.edu:8443/handle/123456789/27868
dc.description: Thesis [en_US]
dc.description.abstract: Social networks are virtual environments where users express their opinions, preferences, and interests. Information shared on social networks such as Facebook, Twitter, and LinkedIn can be very useful for understanding citizens' interests. This study aims to discover users' travel interests from the images they share. Most existing work relies heavily on textual data to analyse social networks and neglects visual data, especially on Facebook, owing to the scarcity of available systems for analysing social images. The proposed system is a novel framework for detecting users' travel interests from the images they post and share on Facebook. The two research questions are: (1) which image-analysis models are currently used to discover users' interests in social networks, especially Facebook, and (2) how can those models be improved, or the best of them selected, to discover users' interests? Methodologically, deep neural network approaches were used. First, we compared feedforward Convolutional Neural Network (CNN) architectures, GoogLeNet and VGG-19, trained on the Places365 dataset for visual object recognition. Once objects are recognised in images, a Deep Ontology Travel User Interest System (DOTUIS) serves as the decision system for predicting users' travel interest. These CNN-based approaches make it straightforward to decide whether a user is interested in travel, with GoogLeNet yielding better classification accuracy than VGG-19. To evaluate the deep neural approach, a new database of images shared on Sudanese and Tunisian Facebook accounts was constructed; the approach showed promising results on the Sudanese Facebook accounts database. Second, we proposed a Deep Fuzzy Ontology Travel User Interest System (DFOTUIS), also based on the GoogLeNet and VGG-19 CNN architectures, which produced very strong results for Sudanese users' travel interests. Both proposed ontologies were tested and evaluated on the collected Sudanese database, and the deep fuzzy ontology system (DFOTUIS) outperformed the crisp inference system (DOTUIS). Finally, based on the profile output by the deep fuzzy ontology, we proposed an Intelligent Recommendation System for Travellers' Preferences (IRSTP). [en_US] (An illustrative code sketch of this pipeline follows the metadata record below.)
dc.description.sponsorship: Sudan University of Science & Technology [en_US]
dc.language.iso: en [en_US]
dc.publisher: Sudan University of Science & Technology [en_US]
dc.subject: Deep Neural Networks [en_US]
dc.subject: Framework [en_US]
dc.subject: Travel User [en_US]
dc.title: Development of Deep Neural Networks Framework for Travel User Interest Discovery from Visual Shared Data in Social Networks (Case Study Facebook, Sudan) [en_US]
dc.title.alternative (translated from Arabic): Development of a Deep Neural Networks Framework for Discovering Users' Travel Interests through Visual Data in Social Networks [en_US]
dc.type: Thesis [en_US]
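
The abstract above describes a two-stage pipeline: CNN-based scene recognition on shared images, followed by an ontology-driven decision about travel interest in crisp (DOTUIS) and fuzzy (DFOTUIS) variants. The record links no code, so the following Python sketch is only illustrative: the torchvision GoogLeNet architecture, the local checkpoint file googlenet_places365.pth, the category file categories_places365.txt, the TRAVEL_SCENES subset, the 0.5 threshold, and the piecewise-linear membership function are all assumptions standing in for the thesis's actual trained models and ontologies.

# Illustrative sketch (not the thesis's code): (1) scene recognition with a
# CNN assumed to be trained on Places365, (2) a crisp (DOTUIS-style) versus
# fuzzy (DFOTUIS-style) decision about a user's travel interest.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# GoogLeNet with a 365-way head; the checkpoint file name is a placeholder
# for locally obtained Places365 weights.
model = models.googlenet(num_classes=365, aux_logits=False, init_weights=True)
state = torch.load("googlenet_places365.pth", map_location="cpu")  # assumed local file
model.load_state_dict(state)
model.eval()

# Places365 category names, one per line (file assumed to be present locally).
with open("categories_places365.txt") as f:
    categories = [line.split(" ")[0].strip("/").split("/")[-1] for line in f]

# Scene categories treated as travel-related: an illustrative subset only.
TRAVEL_SCENES = {"beach", "airport_terminal", "hotel_room", "mountain", "desert_road"}

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def travel_evidence(image_paths):
    """Fraction of a user's shared images whose top-1 scene is travel-related."""
    hits = 0
    for path in image_paths:
        x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
        with torch.no_grad():
            top1 = model(x).argmax(dim=1).item()
        hits += categories[top1] in TRAVEL_SCENES
    return hits / max(len(image_paths), 1)

def crisp_interest(evidence, threshold=0.5):
    """DOTUIS-style crisp rule: interested iff evidence passes a threshold."""
    return evidence >= threshold

def fuzzy_interest(evidence):
    """DFOTUIS-style fuzzy rule: graded membership in 'interested in travel'.
    The piecewise-linear shape is an assumed example, not the thesis's."""
    if evidence <= 0.2:
        return 0.0
    if evidence >= 0.7:
        return 1.0
    return (evidence - 0.2) / 0.5

The contrast between crisp_interest and fuzzy_interest mirrors the abstract's finding: a graded membership value degrades gracefully near the decision boundary, which is one reason a fuzzy system such as DFOTUIS can outperform a crisp threshold rule such as DOTUIS.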
Appears in Collections: PhD theses : Computer Science and Information Technology

Files in This Item:
File: Development of Deep........pdf
Description: Research
Size: 4.24 MB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.