The prevalence of mobile devices, especially mobile phones, has made mobile Internet technology such as mobile social networks, mobile reading, mobile commerce and mobile payment an indispensable part of people’s everyday life. In China, the total monthly time spent on mobile devices keeps growing, and people spent 8.7 minutes more per day on average than they did in 2016 (QuestMobile, 2018). Most Chinese websites, such as E-commerce platforms, have three channels: desktop website, mobile website and mobile applications (apps). In the third quarter of 2017, online shopping on mobile devices accounted for 81.4% of the total volume of Internet transactions, suggesting that mobile commerce (M-commerce) has become dominant in E-commerce (iResearch, 2017). Online consumers’ visiting behavior has been reshaped by the changes in screen size, interface, functionality, and context of use, so it is important to understand the differences between user behaviors on mobile and desktop platforms. A comparison will provide useful insights into the design of online shopping platforms so that mobile and desktop devices can both play better roles in satisfying users and increasing sales and revenue.
2 Literature Review
2.1 Differences between Behaviors of Desktop and Mobile Users
The earliest related study can be traced back to 2000, when Albers found that desktop and mobile users demonstrated rather different information searching and browsing behavior (Albers & Kim, 2000). It was followed by a number of studies investigating how people used both types of devices in online searching and learning. These studies found significant inter-device differences in users’ searching and learning behaviors, performance, intentions, and so on, mostly through laboratory-based experiments or log data analysis.
It has been found that desktop and mobile users differed greatly in terms of the search systems used, the types of information searched, and the levels of user experience and satisfaction (Wu & Bi, 2016). The times and locations of searching, query types and lengths, and clicking patterns all differed among users of mobile phones, tablets and desktops (Song, Ma, Wang, & Wang, 2013). Desktop users spent more time searching and constructed more queries, while mobile users saved more result pages (Ong, Järvelin, Sanderson, & Scholer, 2017). In general, mobile users attached more importance to search system functionalities, and searching behavior on high-end phones was closer to that on desktops (Kamvar & Baluja, 2006; Kamvar, Kellar, Patel, & Xu, 2009). In addition, advertisements embedded in search results attracted more attention from mobile users than from desktop users. When advertisements were displayed on desktop web pages, users scanned the pages more superficially than they did on pages without advertisements, whereas advertisements had little influence on mobile users’ browsing behavior (Djamasbi, Hall-Phillips, & Yang, 2013).
Online learning users also behaved differently on different types of devices, and laboratory-based experiments have yielded interesting results. Learners preferred learning MOOCs on desktops because they encountered more difficulties with mobile MOOCs (Dalipi, Imran, Idrizi, & Aliu, 2017). In contrast, tablet users showed stronger willingness to learn than desktop users (Sung & Mayer, 2013). Differences were found between desktop and mobile searchers in the type of the first click and the number of subsequent clicks, although their page navigation behavior was similar (Wu, Jin, & Wang, 2016). A search log analysis of library users’ searching behavior indicated that mobile users were more likely than both tablet and desktop users to use repeated queries in query reformulation, but the inter-device differences in query reformulation became less and less obvious as the reformulation proceeded. The study suggested that mobile libraries should avoid such ineffective repetition and provide more useful guidance on their interfaces (Wu & Bi, 2017a, 2017b).
Apple’s iOS and Google’s Android are the two most popular operating systems on mobile devices. They have different hardware/operating system requirements, software development tools, frameworks, languages, and documentation as well as developer resources (Bergvall-Kåreborn, Björn, & Chincholle, 2011; Liu, Li, Guo, Shen, & Chen, 2013; Papapanagiotou, Nahum, & Pappas, 2012). Such differences are likely to affect the user behavior and experience they engender, but this has not yet been investigated with scientific methods.
2.2 Clickstream Data Analysis
Clickstream data is generated on the Web server when users interact with a site or an app for their real-world purposes: once a user clicks on a link or a button, a record is added to the transaction log. A clickstream is thus the chronological sequence of clicks that records the user’s interaction with the website or app. Due to its unobtrusiveness and inexpensiveness (Jansen, 2009), clickstream data analysis has been widely applied in the contexts of E-commerce, social media, social commerce, E-learning, Internet portals, and search engines to characterize and model user behaviors as well as to cluster similar users or visits.
Clickstream data has been adopted to cluster online shoppers, model online browsing and purchasing behaviors, and analyze off-site and cross-site visit behavior. Based on such metrics as page requests, visit duration, visit pathway, and visit frequency, users were clustered into browsers, searchers and purchasers (Hofgesang & Kowalczyk, 2005; Moe, 2003). These metrics were also used to model browsing behavior (Bucklin & Sismeiro, 2003) and purchasing behavior (Montgomery, Li, Srinivasan, & Liechty, 2004; Sismeiro & Bucklin, 2004). Further clickstream data analysis indicated that users’ purchasing behavior is related to their off-site and cross-site visit behaviors (Y. H. Park & Fader, 2004; Schellong, Kemper, & Brettel, 2016).
Similarly, social media research used clickstream data to analyze the characteristics of user behavior and thereafter produce user clusters. Specifically, there are two types of user behaviors on social media: visible interactions (e.g. commenting and recommending) and “silent” interactions (e.g. browsing a profile page and viewing a photo), and session durations follow a long-tail distribution (Benevenuto, Rodrigues, Cha, & Almeida, 2009). Most social media users were found to be female, and they used social media for specific purposes such as posting (Chiang & Yang, 2015). Based on the metrics of visit duration and visit pathway, social media users could be clustered into different groups. Linden (2016) created six clusters: goal-oriented browser, editorial-content reader, recreational browser, commercially-oriented browser, active contributor and non-returning user. The six clusters in Banerjee and Ghosh’s (2001) study included users interested in contests, users who glanced through articles, users who spent time on authors and articles, users interested in men-women relationships, users who read the articles, and users interested in philosophy.
Moreover, clickstream data has been applied in E-learning to improve video lectures (Sinha, Jermann, Li, & Dillenbourg, 2014), as well as to detect and predict students’ activities and performance so as to help instructors manage courses more effectively (Brinton & Chiang, 2015; J. Park, Denaro, Rodriguez, Smyth, & Warschauer, 2017). Analysis of the clickstream data of Internet portals revealed the factors influencing users’ choice of portals (Goldfarb, 2001) and provided suggestions for website design (Jiang, 2014). As for search engines, clickstream data could help improve search result ranking (Kou & Lou, 2012) and the measurement accuracy of user experience (Sadagopan & Li, 2008).
Comparative analysis of users’ visiting behavior on different devices, as well as on different mobile channels, helps to characterize users’ behaviors in various environments and to improve interface design according to each environment's advantages and disadvantages. Although it is common for researchers to conduct clickstream data analysis based on such metrics as visits or visitors, visit duration, visit path, and conversion rate, the existing analysis has been largely limited to clickstream data from desktop sites.
Therefore, this study employed clickstream data analysis to explore the differences in users’ visiting behavior between desktop and mobile devices, as well as among the various channels within mobile devices.
The server log collected for this study was provided by Fengqu (http://fengqu.com), a Chinese cross-border E-commerce company. It contained the clickstream records of both desktop and mobile users, which allowed a comparative analysis of their visiting behavior. Specifically, the comparison addressed the following research questions:
- (1) Are desktop and mobile users different in terms of the type of pages they viewed? If yes, what are the differences?
- (2) Are desktop and mobile users different in terms of the pattern of interacting with products? If yes, what are the differences?
- (3) Are desktop and mobile users different in terms of the time they spent on pages? If yes, what are the differences?
3.1 Research Setting
One of the largest cross-border E-commerce companies in China, Fengqu has attracted more than 3 million users in total, with 400,000 daily users. Operating multiple channels, including a desktop site, a mobile site, and a mobile app, it provides users with convenient global shopping services by selling a large variety of products under departments such as baby, beauty, food, health, electronics, fashion, and home and kitchen.
The three channels, though targeting users on different devices, demonstrate a similar architecture which consists of six major page categories: (1) navigation (N): product categories and promotion information; (2) product (P): product details and user comments; (3) transaction (T): shopping cart, order submission, and payment; (4) account (A): personal information management, e.g. shopping history, favorites, registration, and login; (5) help (H): instructions for shopping with Fengqu; and (6) utility (U): additional functions such as scanning quick response (QR) codes, sending verification codes, and checking notifications. In addition, an online community (C), where users can share shopping-related experiences, is unique to the mobile app. A data coding system was created to identify each page category according to either the description part of the associated uniform resource locators (URLs) or the URL typed into the browser (Appendix).
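The coding step above amounts to mapping each URL to one of the seven category codes. A minimal sketch follows; the keyword patterns are hypothetical illustrations, since Fengqu's actual URL scheme is documented only in the Appendix.

```python
import re

# Hypothetical keyword-to-category rules; the real rules come from
# the coding system in the Appendix.
CATEGORY_RULES = [
    (r"/category|/promo", "N"),            # navigation
    (r"/item|/product|/comment", "P"),     # product
    (r"/cart|/order|/pay", "T"),           # transaction
    (r"/account|/login|/favorites", "A"),  # account
    (r"/help|/faq", "H"),                  # help
    (r"/qrcode|/verify|/notify", "U"),     # utility
    (r"/community", "C"),                  # community (mobile app only)
]

def code_page(url: str) -> str:
    """Return the page-category code for a URL, or '?' if unmatched."""
    for pattern, code in CATEGORY_RULES:
        if re.search(pattern, url):
            return code
    return "?"
```

Unmatched URLs are flagged with `?` so they can be reviewed manually rather than silently miscoded.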
3.2 Data Collection, Processing, and Analysis
The log file obtained from Fengqu contains 2,827,449 clickstream records generated on the server over a 4-day period, from 00:00:00 hours of Oct 1, 2016 to 23:59:59 hours of Oct 4, 2016. There are six major fields in the file (Figure 1): (1) sessionid: identifies a visit; (2) traceid: identifies a user; (3) access_time: indicates the start time of a visit; (4) access_end_time: indicates the end time of a visit; (5) appid: identifies the type of device; and (6) page: identifies the URL of the page requested.
Although this log file had been pre-cleaned by Fengqu technicians before provision, certain corrupted or redundant records remained. Further data cleaning was thus performed in Python 3.6 to eliminate (1) incorrect URLs: the page field is empty, or it contains a single word or an external link; and (2) invalid device codes: the appid field contains a code other than 1 (desktop site), 3 (mobile site), 4 (iOS mobile app), or 5 (Android mobile app). The eliminated records accounted for only 0.3% of all the records in the log file. It should be mentioned that no nonhuman records were detected, owing to the anti-spider technologies used by Fengqu. The cleaned log file includes 2,818,788 records, with 517,582 sessions, 181,112 distinct users, and 35,679 distinct URLs.
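The two cleaning rules can be sketched as a simple record filter. The field names follow Figure 1; how an "external link" is detected is an assumption here (checking against the fengqu.com host), as the paper does not specify it.

```python
VALID_APPIDS = {"1", "3", "4", "5"}  # desktop site, mobile site, iOS app, Android app

def is_valid(record: dict) -> bool:
    """Apply the two cleaning rules to one log record."""
    page = (record.get("page") or "").strip()
    # Rule 1: drop empty URLs and bare one-word values
    if not page or "/" not in page:
        return False
    # Rule 1 (cont.): drop external links -- hypothetical host check
    if not page.startswith(("http://fengqu.com", "/")):
        return False
    # Rule 2: drop invalid device codes
    return record.get("appid") in VALID_APPIDS
```

Filtering the full log is then a single pass, e.g. `cleaned = [r for r in records if is_valid(r)]`.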
Next, this study adopted the general framework of clickstream data analysis established by Jiang (2010) to conduct the analysis. According to this framework, each click leaves a footprint, a mark showing one’s presence on a page, and a movement, the change of location from one page to another; a pathway, composed of all the movements arranged in the order they occur, takes shape after a series of clicks. The framework has been successfully applied in clickstream data analyses of users’ information behavior in social tagging systems (Jiang, 2014) and academic library online public access catalog (OPAC) systems (Jiang, Chi, & Gao, 2017). In order to answer the research questions posed earlier, this study focused on footprint analysis and compared desktop and mobile users side by side in terms of footprint distribution, core footprints, and footprint depth.
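Under this framework, one session's footprints, movements, and pathway can all be derived from its time-ordered clicks. A minimal sketch, assuming each click is represented as a (timestamp, page_category) tuple:

```python
def analyze_session(clicks):
    """Derive footprints, movements, and the pathway from one session.

    clicks: list of (timestamp, page_category) tuples, in any order.
    """
    clicks = sorted(clicks)                            # order by timestamp
    footprints = [page for _, page in clicks]          # one footprint per click
    movements = list(zip(footprints, footprints[1:]))  # page-to-page location changes
    pathway = movements                                # all movements in time order
    return footprints, movements, pathway
```

The footprint-level analyses in the following sections operate on the first of these three outputs.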
4.1 Footprint Distribution Analysis
A user produces a footprint whenever a page is opened, and each footprint is of the same type as the page it falls on. Fengqu users created a total of 2,818,788 footprints during the 4 days of the study. First, this study analyzed the footprint distribution to illustrate users’ visiting behaviors. Table 1 shows the distribution among the seven page categories. As the table shows, mobile users’ footprints far outnumber those of desktop users. Overall, navigation pages attracted the most footprints (38.63%) whereas help pages yielded the fewest (0.03%). This means that finding products is the main online shopping activity, whereas asking for help rarely occurs, presumably because people are already familiar with E-commerce websites.
The footprint distribution differs among devices, as seen in Table 1. As far as desktop users are concerned, more than half of their footprints were left on product pages (53.95%), followed by navigation pages (37%). In contrast, mobile users’ footprints were most frequently seen on navigation pages (38.66%), and product pages account for only 13.82%, even falling behind utility pages (21.81%) and account pages (16.07%). This distinction between the two platforms suggests that people mainly use the desktop site to acquire product details while using mobile devices to find products. Notably, transaction pages make up only a small share of the overall footprints on both mobile (8.62%) and desktop (5.72%) devices. This may have two causes: first, users may browse many product detail pages (PDPs) before finally deciding to pay for one product; second, a considerable number of people enter online shopping websites just for entertainment rather than to buy a product. In other words, they have no explicit shopping goals and may browse many products just for fun.
Table 1. Footprint Distribution among Devices

| Page category | All (Quantity) | All (%) | Desktop (Quantity) | Desktop (%) | Mobile (Quantity) | Mobile (%) |
|---|---|---|---|---|---|---|
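The percentages in Table 1 amount to a frequency count over footprint types. A minimal sketch, assuming footprints are represented by their category codes:

```python
from collections import Counter

def footprint_distribution(footprints):
    """Percentage of footprints per page category, rounded to 2 decimals."""
    counts = Counter(footprints)
    total = sum(counts.values())
    return {cat: round(100 * n / total, 2) for cat, n in counts.items()}
```

Running the same function over the desktop-only and mobile-only subsets of footprints yields the per-device columns.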
Next, this study analyzed footprint distribution on the iOS app, the Android app and the mobile website, the three different channels of mobile devices. As shown in Table 2, the footprint distributions of the iOS and Android apps are similar: the orderings of their page types by footprint quantity match, i.e., navigation, utility, account/product, transaction, community, and help pages (from largest to smallest). The mobile website, by contrast, is extremely distinct from the mobile apps while resembling the desktop website. Specifically, product pages (61.36%) and utility pages (0.24%) take the largest and smallest shares, respectively, of all the mobile website’s footprints, whereas utility pages account for quite a large share on both mobile apps (iOS 18.99%, Android 27.94%). These different distributions demonstrate that the mobile website focuses on providing information while the mobile apps emphasize utility and interaction.
Table 2. Footprint Distribution among Channels

| Page category | iOS app (Quantity) | iOS app (%) | Android app (Quantity) | Android app (%) | Mobile website (Quantity) | Mobile website (%) |
|---|---|---|---|---|---|---|
4.2 Core Footprint Analysis
Different types of pages play different roles in users’ visiting process. Some pages are designed for achieving users’ goals, such as acquiring information and completing tasks, while others are aimed at helping users arrive at target pages. In this research, users’ target is browsing or buying products. The PDP, which displays the textual and visual descriptions of a product, has a crucial influence on the conversion rate (Gurley, 2000). This study focused on the viewing of this subcategory of pages; the results of the analysis can be seen in Table 3.
Table 3. Results of PDP Viewing Analysis among Devices

| Metric | Desktop | Mobile |
|---|---|---|
| Total frequency of PDP viewing | 24,376 | 353,743 |
| Percentage of PDP viewing | 53.95% | 12.75% |
| Number of distinct users who have viewed products | 20,621 | 68,736 |
| Number of distinct products that have been viewed | 10,119 | 18,959 |
| Average viewing frequency of a user | 1.18 | 5.14 |
| Average viewing frequency of a product | 2.41 | 18.66 |
Overall, mobile devices accommodated much more vibrant PDP viewing activity than desktop devices: the total frequency of PDP viewing by mobile users is 14.5 times that of desktop users (353,743/24,376). In terms of proportion, however, product viewing is obviously more important to desktop users (53.95%) than to mobile users (12.75%).
A user may view multiple products, and a product may be viewed by multiple users. Though PDP viewing on mobile devices involved more distinct users and distinct products, the gap between the two devices narrows to 3.3 times (68,736/20,621) and 1.9 times (18,959/10,119), respectively. Correspondingly, on average, one user views more products, and one product is viewed more often, on mobile platforms than on desktop platforms. The Mann-Whitney U-test, a nonparametric counterpart of the independent-samples t-test that is widely used when sample data are not normally distributed, was adopted to compare the two independent populations. Desktop and mobile viewing show significant differences (p < 0.001): both the viewing frequency of a user (mean rank: mobile 35,587.77 > desktop 27,147.37) and the viewing frequency of a product (mean rank: mobile 17,226.5 > desktop 9,505.13) are significantly higher on mobile than on desktop devices. Thus, Fengqu’s mobile channels have achieved much higher effectiveness in reaching potential customers and in generating publicity for products.
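For reference, the U statistic underlying this test can be computed from pooled ranks, using average ranks for ties. The sketch below (standard library only) computes the statistic itself; obtaining p-values requires the sampling distribution, e.g. via `scipy.stats.mannwhitneyu`.

```python
def rank_sum(x, y):
    """Rank the pooled sample (average ranks for ties); return x's rank sum."""
    pooled = sorted(x + y)
    rank_of, i = {}, 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1                     # [i, j) is a run of tied values
        rank_of[pooled[i]] = (i + 1 + j) / 2  # mean of ranks i+1 .. j
        i = j
    return sum(rank_of[v] for v in x)

def mann_whitney_u(x, y):
    """Mann-Whitney U statistic for sample x versus sample y."""
    n1 = len(x)
    return rank_sum(x, y) - n1 * (n1 + 1) / 2
```

U ranges from 0 (every x below every y) to len(x)*len(y) (the reverse); mean ranks like those reported above come directly from the same pooled ranking.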
The distributions of PDP viewing occurrences among users and products on desktop and mobile devices were further explored with log-log scale plots (Figures 2-5). As the figures show, mobile devices exhibit an obvious power-law distribution. In other words, the majority of mobile users form the long tail: they viewed only one or two products during their whole visit. On the other hand, a few active users viewed a large quantity of products (Figure 4). Likewise, most products were ignored by mobile users, but a few popular products attracted extensive attention (Figure 5). The desktop distributions, however, are more distinctive. The top 100 users viewed approximately 150 products, after which the number sharply decreased to 25 products (Figure 2). Moreover, the viewing frequency of a product on the desktop displays a stepped distribution (Figure 3): the products at each level were viewed a similar number of times and can thus be grouped together.
4.3 Footprint Depth Analysis
Footprint depth is the viewing duration of a page: the longer a user stays on a page, the more attention he or she pays to it. Table 4 displays the average page viewing durations among devices.
Table 4. Average Page Viewing Durations among Devices (in seconds)
Overall, the footprint depths of different pages are diverse. Users spent the longest time on help pages (43.36s), followed by product pages (26.36s) and navigation pages (22.24s), and the shortest time on utility pages (16.25s). This distribution is consistent with the features of each page category: information-oriented pages require people to spend considerable time browsing content, while function-oriented pages only require people to perform an action, which consumes little time. On mobile devices, the footprint depth distribution matches the general distribution; in addition, community pages, which are unique to mobile devices and are information-oriented, have the third longest duration (23.90s), following help pages (36.59s) and product pages (26.71s). On desktops, surprisingly, in addition to help pages (61.99s), users left deep footprints on utility (49.66s), navigation (48.31s) and transaction (34.41s) pages, while product pages lagged behind (20.89s). This unusual distribution might be caused by the design of the desktop website.
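Footprint depth as defined here is a per-category mean of page viewing durations. A minimal sketch, assuming each record carries a category code and start/end times in seconds (per the access_time and access_end_time fields):

```python
from collections import defaultdict

def average_depths(records):
    """Mean viewing duration (seconds) per page category.

    records: iterable of (category, start_seconds, end_seconds) tuples.
    """
    totals, counts = defaultdict(float), defaultdict(int)
    for cat, start, end in records:
        totals[cat] += end - start   # depth of one footprint
        counts[cat] += 1
    return {cat: totals[cat] / counts[cat] for cat in totals}
```

Splitting the records by device (or mobile channel) before calling this function produces the columns of Tables 4 and 5.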
Overall, the average page-viewing duration of desktop users (32.14s) is longer than that of mobile users (20.57s). In particular, desktop users spent approximately twice as much time as mobile users on navigation, transaction, help and utility pages. As the mean values might be affected by extremely long durations, the Mann-Whitney U-test, which does not require normality, was again chosen to examine the inter-device differences. Significant differences in duration were found for the navigation, transaction, help and utility pages (p < 0.05). Desktop users spent a significantly longer time than mobile users on navigation pages (mean ranks: desktop 694,634.82 > mobile 496,449.15), transaction pages (mean ranks: desktop 152,669.95 > mobile 120,566.12) and utility pages (mean ranks: desktop 448.72 > mobile 333.90), but a significantly shorter time on help pages (mean ranks: desktop 333.90 < mobile 448.72). The mean ranks for the help pages run opposite to the means in Table 4, which is possibly caused by a few extremely long desktop durations.
A further analysis was carried out on the footprint depth among mobile channels. Table 5 shows that the iOS app and the Android app have similar footprint depth distributions on most types of pages, the former being slightly shallower than the latter. However, a significant distinction appears for utility pages: 3.77s on the iOS app versus 32.53s on the Android app. The depth orderings of the two apps are similar except for utility pages (iOS: H > P > C > N > A > T; Android: H > P > C > N > T > A). The mobile website, by contrast, is completely different from the mobile apps: its top three durations are on navigation (37.87s), transaction (29.12s) and account (25.09s) pages, which again demonstrates a distribution similar to the desktop website.
Table 5. Average Page Viewing Durations among Channels (in seconds)

| Page category | iOS | Android | Mobile website |
|---|---|---|---|
This study analyzed the clickstream data from the popular website Fengqu at the footprint level to compare mobile and desktop users’ visiting behaviors, and the results showed that they were very different in terms of footprint distribution, footprint depth and core footprint distribution.
On the whole, Fengqu provides its service mainly through mobile apps rather than through the desktop or mobile websites, the latter two accounting for only 3% of all footprints. This is in line with Chinese Internet development: among all Chinese Internet users, the proportion of mobile users increased from 95.1% in 2016 to 97.5% in 2017, while the frequency of use of desktops, laptops and tablets decreased (CNNIC, 2018). This is principally attributable to the mobile phone’s portability, which allows individuals to use it whenever and wherever they like.
Footprint comparison analysis indicates that people’s viewing behaviors on different devices are distinct: users mainly view products through desktops, including browsing product details and providing comments, whereas they primarily use mobile devices to discover products, i.e., to learn what products the platform offers. This behavioral distinction comes, to some degree, from thinking styles: desktop users prefer a rational thinking style based on logic and judgment, while mobile users tend to have an experiential thinking style based on instinct and emotions (Zhu & Meyer, 2017). Product details can be displayed more clearly and completely on the desktop’s large screen, which lightens users’ visual burden. Individuals with explicit purchasing intentions are therefore more likely to reach rational decisions on the desktop site.
In contrast, people focus on enjoyment rather than utility when using mobile devices. The usage of mobile devices is not limited to finishing tasks but extends to satisfying individuals’ interests, making friends, and even killing time. The community, a function unique to the mobile platform, also confirms this viewpoint. Core footprint analysis indicated that although product viewing took a larger proportion on desktops, mobile users performed more efficiently: first, users browsed more products, confirming the earlier-stated perspective that people tend to explore information through mobile devices; second, products were viewed more frequently, because products on mobile devices are shown on various pages such as account pages and community pages. In addition to seeking products on navigation pages, users may encounter them on account and community pages and then be materially or emotionally motivated to view product details.
In terms of footprint depth, users overall spent more time on desktop than on mobile devices. The duration of the activity a page hosts is a crucial factor influencing its viewing duration. Specifically, mobile activities are mainly concentrated on navigation, utility and account pages. Navigation pages are responsible for directing individuals to target pages rather than holding on to them. Utility and account pages host various short-duration activities such as checking promotions, signing in (earning shopping credits with the first sign-in each day), and scanning QR codes.
Product viewing, the most common activity on the desktop, requires individuals to read large quantities of information and is thus supposed to be time-consuming. Nevertheless, navigation, transaction and utility pages on the desktop surprisingly have deeper footprints than those on mobile devices. This may be caused by Fengqu’s information architecture. The mobile apps have separate “category” pages that classify the whole range of products into 11 categories, each with second- and third-level sub-categories. This comprehensive, well-organized navigation with product pictures fits people’s high visual bandwidth, so they can recognize and locate target products in seconds. In contrast, the desktop website has only six categories, and its navigation is neither logical nor complete. Consequently, users have to spend a great amount of time finding their target pages. As for the desktop transaction and utility pages, their payment and download activities are both carried out by scanning a QR code with a mobile device; since the activity extends to the mobile device, users stay longer on the desktop page.
In particular, this research compared the three mobile channels. The results indicated that the footprint distribution and footprint depth distribution of the two kinds of apps are similar, while the distributions of the mobile website resemble those of the desktop website. The apps, once downloaded onto mobile devices, are convenient and stable. In addition, mobile devices’ extra functions, such as scanning QR codes and pushing message notifications, provide various interactions for exploring products. Between the two apps, Android users overall spent a longer time than iOS users, which may be related to the slower loading speed of the Android system. Notably, their viewing durations on utility pages differ significantly: the original clickstream data illustrated that Android users were stuck on the login pages for many seconds, possibly due to a defect of the Android system. Compared with the apps, the mobile and desktop websites are visited through a browser and thus offer fewer interactions; they better suit users who shop occasionally and who mainly search for information.
This research collected a 4-day set of clickstream data (2,827,449 records) from an E-commerce website and conducted footprint analysis based on the general framework of clickstream data analysis, covering footprint distribution analysis, core footprint analysis and footprint depth analysis. Overall, mobile apps have become the main channel for surfing the Internet, and individuals perform more activities on apps. Furthermore, people prefer to explore and discover products through mobile apps while learning product details through websites. The interactivity of apps improves both users’ involvement and products’ visibility. Moreover, designers attach more importance to the development of apps, and correspondingly users carry out activities more efficiently on them. Additionally, the visiting behaviors of users on the two types of apps show little difference, except for the Android app’s slower response speed.
Our study is, to a certain degree, limited by its sample size and methodology. Clickstream data analysis can explicitly illustrate behavioral data, but it fails to record the implicit information that explains behavior. In addition, a 4-day span is somewhat short, and the results may be characteristic of that particular period. Future studies will therefore expand the research data to provide a longitudinal view of users’ information behavior.
This research was supported by the National Natural Science Foundation of China under Grants No. 71774125 and No. 71420107026.
Albers, M. J., & Kim, L. (2000, September). User web browsing characteristics using palm handhelds for information retrieval. In Proceedings of IEEE professional communication society international professional communication conference and Proceedings of the 18th annual ACM international conference on Computer documentation: technology & teamwork (pp. 125-135). Cambridge, Massachusetts, USA.
Banerjee, A., & Ghosh, J. (2001, April). Clickstream clustering using weighted longest common subsequences. In Proceedings of the web mining workshop at the 1st SIAM conference on data mining (Vol. 143, p. 144). San Diego, California.
Bergvall-Kåreborn, B., Björn, M., & Chincholle, D. (2011). Motivational profiles of toolkit users: iPhone and Android developers. International Journal of Technology Marketing, 6(1), 36–56.
Brinton, C. G., & Chiang, M. (2015, April). MOOC performance prediction via clickstream data and social learning networks. In 2015 IEEE Conference on Computer Communications (INFOCOM) (pp. 2299-2307). Retrieved from http://www.3nightsdone.org
Bucklin, R. E., & Sismeiro, C. (2003). A model of web site browsing behavior estimated on clickstream data. Journal of Marketing Research, 40(3), 249–267.
Chiang, I. P., & Yang, S. Y. (2015, September). Exploring users' information behavior on Facebook through online and mobile devices. In International Conference on Multidisciplinary Social Networks Research (pp. 354-362). Springer, Berlin, Heidelberg.
CNNIC. (2018). The 41st China Statistical Report on Internet Development. Retrieved from
Dalipi, F., Imran, A. S., Idrizi, F., & Aliu, H. (2017). An analysis of learner experience with MOOCs in mobile and desktop learning environment. In J. Kantola, T. Barath, S. Nazir, & T. Andre (Eds.), Advances in Human Factors, Business Management, Training and Education (pp. 393–402). Cham: Springer.
Djamasbi, S., Hall-Phillips, A., & Yang, R. R. (2013, August). An examination of ads and viewing behavior: An eye tracking study on desktop and mobile devices. In Proceedings of the Nineteenth Americas Conference on Information Systems. Chicago, Illinois.
Goldfarb, A. (2002). Analyzing website choice using clickstream data. In The Economics of the Internet and E-commerce (pp. 209-230). Retrieved from https://arxiv.org/pdf/cs/0110008.pdf
Gurley, J. W. (2000). The one internet metric that really matters. Fortune, 141(5), 392. Retrieved from https://elibrary.ru/item.asp?id=3794367
Hofgesang, P. I., & Kowalczyk, W. (2005). Analysing clickstream data: From anomaly detection to visitor profiling. ECML/PKDD Discovery Challenge 2005. Retrieved from https://www.researchgate.net/publication/228722329_Analysing_clickstream_data_From_anomaly_detection_to_visitor_profiling
iResearch. (2017). 2017 China M-Commerce Third-Quarter Report. Retrieved from http://www.iresearch.com.cn/Detail/report?id=3112&isfree=0
Jansen, B. J. (2009). Understanding user-web interactions via web analytics. In G. Marchionini (Ed.), Synthesis Lectures on Information Concepts, Retrieval, and Services (pp. 1-102). Retrieved from https://doi.org/10.2200/S00191ED1V01Y200904ICR006
Jiang, T. (2011). Characterizing and evaluating users' information seeking behavior in social tagging systems (Doctoral dissertation). Retrieved from http://dscholarship.pitt.edu/10412/1/Jiang_Tingting_etd2010.pdf
Jiang, T. (2014). A clickstream data analysis of users' information seeking modes in social tagging systems. In iConference 2014 Proceedings (pp. 314–328). doi:10.9776/14091
Jiang, T., Chi, Y., & Gao, H. (2017). A clickstream data analysis of Chinese academic library OPAC users' information behavior. Library & Information Science Research, 39(3), 213–223.
Kamvar, M., & Baluja, S. (2006, April). A large scale study of wireless search behavior: Google mobile search. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 701-709). Montréal, Québec, Canada.
Kamvar, M., Kellar, M., Patel, R., & Xu, Y. (2009, April). Computers and iPhones and mobile phones, oh my!: A logs-based comparison of search users on different devices. Paper presented at the International Conference on World Wide Web, Madrid, Spain.
Kou, G., & Lou, C. (2012). Multiple factor hierarchical clustering algorithm for large scale web page and search engine clickstream data. Annals of Operations Research, 197(1), 123–134.
Lindén, M. (2016). Path analysis of online users using clickstream data: Case online magazine website (Master's thesis). Retrieved from http://www.doria.fi/bitstream/handle/10024/120865/ProGradu_Linden_final.pdf?sequence=2&isAllowed=y
Liu, Y., Li, F., Guo, L., Shen, B., & Chen, S. (2013, March). A comparative study of Android and iOS for accessing internet streaming services. In International Conference on Passive and Active Network Measurement (pp. 104-114). Berlin, Heidelberg.
Moe, W. W. (2003). Buying, searching, or browsing: Differentiating between online shoppers using in-store navigational clickstream. Journal of Consumer Psychology, 13(1-2), 29–39.
Montgomery, A. L., Li, S., Srinivasan, K., & Liechty, J. C. (2004). Modeling online browsing and path analysis using clickstream data. Marketing Science, 23(4), 579–595.
Ong, K., Järvelin, K., Sanderson, M., & Scholer, F. (2017, August). Using information scent to understand mobile and desktop web search behavior. In Proceedings of the 40th International ACM SIGIR Conference on Research and Development in Information Retrieval (pp. 295-304). Shinjuku, Tokyo, Japan.
Papapanagiotou, I., Nahum, E. M., & Pappas, V. (2012). Smartphones vs. laptops: Comparing web browsing behavior and the implications for caching. Performance Evaluation Review, 40(1), 423–424.
Park, J., Denaro, K., Rodriguez, F., Smyth, P., & Warschauer, M. (2017, March). Detecting changes in student behavior from clickstream data. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 21-30). Vancouver, BC, Canada.
QuestMobile. (2018). 2017 China Mobile Internet Report. Retrieved from https://www.questmobile.com.cn/blog/en/blog_130.html
Sadagopan, N., & Li, J. (2008, April). Characterizing typical and atypical user sessions in clickstreams. In Proceedings of the 17th International Conference on World Wide Web (pp. 885-894). Beijing, China.
Schellong, D., Kemper, J., & Brettel, M. (2016, June). Clickstream data as a source to uncover consumer shopping types in a large-scale online setting. Paper presented at ECIS 2016 (Research Paper 1), İstanbul, Turkey.
Sinha, T., Jermann, P., Li, N., & Dillenbourg, P. (2014). Your click decides your fate: Inferring information processing and attrition behavior from MOOC video clickstream interactions. arXiv preprint arXiv:1407.7131. Retrieved from https://arxiv.org/pdf/1407.7131.pdf
Sismeiro, C., & Bucklin, R. E. (2004). Modeling purchase behavior at an e-commerce web site: A task-completion approach. Journal of Marketing Research, 41(3), 306–323.
Song, Y., Ma, H., Wang, H., & Wang, K. (2013, May). Exploring and exploiting user search behavior on mobile and tablet devices to improve search relevance. In Proceedings of the 22nd International Conference on World Wide Web (pp. 1201-1212). Rio de Janeiro, Brazil.
Sung, E., & Mayer, R. E. (2013). Online multimedia learning with mobile devices and desktop computers: An experimental test of Clark's methods-not-media hypothesis. Computers in Human Behavior, 29(3), 639–647.
Wu, D., & Bi, R. (2016). Mobile and desktop search behaviors: A comparative study. New Technology of Library and Information Service, 32(2), 1-8. Retrieved from http://en.cnki.com.cn/Article_en/CJFDTOTAL-XDTQ201602001.htm
Wu, D., & Bi, R. (2017a). Impact of device on search pattern transitions: A comparative study based on large-scale library OPAC log data. Electronic Library, (3), 00-00.
Wu, D., & Bi, R. (2017b). Query reformulation in accessing library OPAC: A comparative study on different devices.
Wu, D., Jin, X., & Wang, L. (2016). A comparative study on the subsequent clicks of mobile library and non-mobile library. Library & Information Service, (18), 27-34. doi:10.13266/j.issn.0252-3116.2016.18.004
Zhu, Y., & Meyer, J. (2017). Getting in touch with your thinking style: How touchscreens influence purchase. Journal of Retailing and Consumer Services, 38, 51–58. Retrieved from http://www.acrwebsite.org/volumes/1022449/volumes/v44/NA-44
Page Category Coding System

| Page category | Code | Action type | URL strings |
| --- | --- | --- | --- |
| Community | C1, C2 | Theme article, Discovery | /order/service-remindlogistics, /apponly/discoverthemecomment, /apponly/discovermytheme, /apponly/segmentmall, /apponly/segmentsearch, /apponly/themesearch |
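In practice, a coding system like the one above is applied to raw clickstream logs as a substring lookup on the requested URL. The sketch below is illustrative only: the URL patterns come from the Community row of the table, but their grouping under C1 versus C2 is an assumption made for this example, not the mapping actually used in the study.

```python
# Minimal sketch of page-category coding for clickstream URLs.
# NOTE: the assignment of URL patterns to C1 vs. C2 below is assumed
# for illustration; only the pattern strings come from the table above.
CATEGORY_RULES = {
    "C1": ["/apponly/discoverthemecomment",      # Theme article (assumed)
           "/apponly/discovermytheme"],
    "C2": ["/apponly/segmentmall",               # Discovery (assumed)
           "/apponly/segmentsearch",
           "/apponly/themesearch"],
}

def code_page(url: str) -> str:
    """Return the first matching page-category code for a URL, else 'UNK'."""
    for code, patterns in CATEGORY_RULES.items():
        if any(pattern in url for pattern in patterns):
            return code
    return "UNK"

print(code_page("/apponly/segmentmall?from=home"))  # C2
print(code_page("/unknown/path"))                   # UNK
```

Substring matching rather than exact matching is used because logged URLs typically carry query-string parameters that should not affect the assigned category.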