The modernization of public entities is commonly understood as the design of websites and data portals and as participation through social networks. This has brought about a revolutionary change in the public sector, enabling government transparency and accountability (Jaeger and Bertot, 2010).
This paper aims to analyse how accessible information is within the new Information and Communication Technologies’ (ICTs) media tools. The accessibility and usability of a government website can be measured by the time consumed and the ease with which stakeholders find information about internal work, decision procedures and resource allocation. Contributing methodologically to earlier research, this study draws on the model of Pina et al. (2009) to measure usability and on that of Alcaraz Quilés et al. (2018) to measure the disclosure of Global Reporting Initiative (GRI) items in order to analyse transparency in the South American Central Governments (SACGs). Additionally, in line with Coy and Dixon (2004), the qualitative accounting characteristics that a government disclosure report must comply with were analysed. From a practical point of view, only the usability and accessibility of the reports were tested.
The review of previous research has allowed us to identify accountability as the concept implying that actors have obligations to act in ways consistent with accepted standards of behaviour. Western countries have experienced a growing demand for accountability as a key element in the democratization of states (Filgueiras, 2016), in other words, as a key to better governance (Hood and Heald, 2006), recognizing the importance of information for effective modern democratic governance (Stiglitz, 2002). Thus, mechanisms for appropriate accountability must be institutionalized (Grant and Keohane, 2005).
Numerous studies have recognized that ‘website openness’ promotes government accountability (Wong and Welch, 2004). With the aid of information technologies (ITs), public administrations can adopt increasingly influential means of improving access to government services and information for anyone, anytime, anywhere, thus improving accountability (Caba Pérez et al., 2008) and creating significantly higher public value (Scott and Meijer, 2016).
However, very few studies measure the reporting of nations and the related accountability and openness at an international level, and even fewer in the South American (SA) region. Furthermore, as far as the authors could find, there is no accountability index designed for the comprehensive measurement and comparison of the accountability of nations’ central governments’ reporting. Hence, this paper aims to fill this gap by addressing countries’ principles of good governance in relation to accountability and the use of the GRI, as some of these principles demonstrate strong commitment to ethical values, respect for the rule of law, openness and comprehensive stakeholder engagement. The present paper therefore derives a new model to measure the accountability of countries, diagnosing their true level of accountability in a comprehensive manner and revealing their real progress or setbacks. This index considers information disclosure in terms of the GRI items in the public sector, as these indicate the accountability level of the country in question, and applies it to the South American Central Governments. Our combined or global model enables stakeholders to observe, compare and analyse the information of SACGs, improving the accountability of management and citizens’ access to information.
The remainder of this paper is organized as follows: first, a literature review on accountability and its implications is presented; second, the data and methodology are explained; finally, the main findings, discussion and conclusions are presented.
LITERATURE BACKGROUND ON ACCOUNTABILITY
The democratization of countries has been developed through an increase of governments’ accountability. Governments’ accountability and transparency are closely linked and can be defined as the openness of information about internal procedures, decision-making, policies and resource allocation towards citizens (Filgueiras, 2016; Fox, 2007).
Transparency has been described as the key to better governance (Hood & Heald, 2006); the literature recognizes the importance of information for effective democratic governance (Stiglitz, 2002) and the role of international financial institutions in promoting fiscal transparency (Florini, 1999; Kopits & Craig, 1998).
The generally accepted definition of ‘accountability’ is that used by Lourenço (2015), based on those provided by Armstrong (2005) and Bovens (2007), for whom public accountability means the obligation of public officials to report on the usage of public resources and the answerability of government to the public for meeting stated performance objectives. The concept of accountability implies that the actors being held accountable have obligations to act in ways that are consistent with accepted standards of behaviour and that they will be sanctioned for failing to do so (Grant & Keohane, 2005).
As for ‘transparency’, Armstrong (2005) defines it as ‘unfettered access by the public to timely and reliable information on decisions and performance in the public sector’. Research on accountability, however, remains limited: only a few studies have engaged in systematic, comparative empirical work, a gap that needs to be filled in order to fully appreciate the context-specific mechanisms through which accountability is fulfilled (Bovens, 2010).
The literature on the measurement of accountability is still scarce, even though it has been growing over the past decade. Some of the few examples are presented below.
Wong and Welch (2004) conduct an empirical study of the relation between accountability and e-government in fourteen countries across five continents, namely Australia, Canada, China, Egypt, France, Germany, Indonesia, Japan, Korea, the Netherlands, New Zealand, Singapore, the United Kingdom, and the United States. Their research applies regression analysis to study the attributes of accountability (transparency, interactivity and openness) between 1997 and 2000, with openness, transparency and interactivity as the dependent variables.
Coy and Dixon (2004) craft a disclosure index with parametric statistical properties to increase the accountability of public sector organisations through improved reporting. The Public Accountability Index (PAI) is developed from a public accountability perspective using stakeholder opinions captured via a Delphi exercise, a method which relies on a panel of experts, and is applied to the annual reports of the eight New Zealand universities in the period 1985–2000. The dimensions taken into account are qualitative characteristics such as relevance, timeliness, accessibility, overview report, overview university, service community, service general, service teaching, service research, teaching process, teaching output and outcome, and financial, among others. The PAI is offered as a generic stakeholder approach for developing disclosure indices to measure the annual reports of groups of organisations in the private, public or so-called third sectors.
Another study, based on exploratory interviews (Salas Quirós, 2015), proposes an index of accountability for centralized public sector management in Costa Rica. This index develops a total of six dimensions: four of internal control (classic: legality, economy, form; management: economy, effectiveness and efficiency; organizational: structures, processes, staff; judicial: contentious-administrative court, constitutional tribunal) and two of external control (parliamentary: political, economic, citizen attention; external audit: administrative bodies, consulting), applied in two ministries of Costa Rica in order to contribute to increased transparency and efficiency in their functions by diagnosing their level of accountability. According to the results, the accountability culture is weakly consolidated.
Other current research on accountability measurement comes from the International Federation of Accountants (IFAC) and the Chartered Institute of Public Finance and Accountancy (CIPFA), which, with the Zurich University of Applied Sciences as a knowledge partner, developed the International Public Sector Financial Accountability Index (IFAC, 2018). The index focuses on federal/central governments and considers two fundamental aspects: Accounting Basis, providing an accurate picture of the extent of accrual accounting and IPSAS adoption globally, and Financial Reporting Standards, focusing on the quality of financial accountability information.
The present study analyses government reporting and accounting issues using 75 questions from the Global Reporting Initiative (GRI) questionnaire (see Appendix A for the complete list of items) as an indicator of the overall quality of reporting and degree of transparency. The GRI is the most trusted and widely used framework in the world, helping businesses, governments and other organizations understand and communicate their impact on key aspects of sustainability reporting practices (IFAC, 2014, 2015, 2018).
Methodologically, this analysis has taken into account the previously mentioned modelling studies: Alcaraz Quilés et al. (2014, 2018) to measure GRI items’ disclosure, Wong and Welch (2004) regarding the attributes of accountability, Coy and Dixon (2004) for the public accountability perspective, and Salas Quirós (2015) for the classic, management and organizational dimensions of internal control. All four models have been taken into account to develop an accountability index for governments, which was tested on the 12 South American Central Governments. This combined or global Accountability Index may help stakeholders to observe, compare and analyse the transparency information of SACGs, improving the accountability of management and citizens’ access to information.
The term ‘click’ denotes the noise produced when a user presses a mouse button and, by extension, the basic interaction a user has with a computer system. The click count shows how many steps users have to follow to surf from one point to another (Huberman et al., 1998; Milic-Frayling et al., 2004). As such, the click has become the unit of measure for website traffic, used to gauge popularity and economic value.
In recent years, many studies have been conducted to measure e-government. Most of them evaluated the efficiency of portals with regard to existing features, such as the presence of designated information (Finger & Cotti, 2002; Ingram & Gray, 1998; Kerschot & Poté, 2002; Glover et al., 1999; West, 2002). Usability experts (Blackmon, Polson, Kitajima & Lewis, 2002; Kalbach, 2002; Ritter, 2002; Zeldman, 2001) insist on the importance of information being available within a few clicks. Hence, empirical surveys on Web navigation use the click as a measurement value or accessibility norm (Huberman et al., 1998; Milic-Frayling et al., 2004).
Beyond the click as functionality, the number of clicks, and specifically the three-click rule, is seen as a global way of designing, optimising and organizing websites. In other words, the three-click rule, which states that most people give up after three clicks, defines a tolerance threshold in the supposed surfing habits of internet users (Bernard, 2002; Kalbach, 2002; Porter, 2003; Zaphiris, 2000; Zeldman, 2001). However, some authors take a more elastic view and consider that the number of clicks is not so important as long as users feel they are going in the right direction; for these authors, the quality of the navigation milestones is as important as the number of steps to follow. In particular, Porter (2003) showed that the three-click rule is just a myth, affirming, based on his data, that users often kept going for as many as 25 clicks.
In order to achieve the proposed objective, a cross-sectional study with non-experimental design was carried out to describe the relevance of the information transparency of the countries’ official portal websites. This study is concerned with the central governments of South America, and therefore, the population consists of 12 countries, namely Argentina, Bolivia, Brazil, Chile, Colombia, Ecuador, Guyana, Paraguay, Peru, Suriname, Uruguay and Venezuela. The study has used the 75 items of the GRI questionnaire, following Alcaraz Quilés et al. (2014, 2018). A complete list of the items can be found in Appendix A.
The SA governments’ official portal websites are quite similar in their structure. However, there are some differences worth noticing. In particular, the design and services offered by Uruguay, Chile and Argentina are very attractive and user friendly, while the Ecuador, Bolivia and Venezuela web pages are the opposite: not very attractive and not very user friendly. None of them offers access in multiple languages (only Spanish for most, except Portuguese in Brazil, English in Guyana and Dutch in Suriname), nor windows and dynamic information with images and videos, except for Uruguay. They do offer an institutional email through which citizens can raise any issue and a site map to help citizens find their way easily. All countries use their social networks frequently except Suriname and Guyana. Only Argentina, Paraguay and Uruguay have technical information about the portal in the footer. It is interesting to highlight that Argentina has a vertical surfing menu with different types of usable information citizens can consult and access, together with the possibility of completing some administrative formalities through the web. Finally, the Bolivia and Venezuela websites do not have a search engine.
In order to process the search, an analyst examined each government official website portal, then accessed the database and counted the number of clicks needed to access the information. A list of the 12 SA countries’ official portal websites, used as the starting point for the click count, can be found in Appendix B. Data collection took place from May to September 2018. Information that was ‘not accessed’ was coded as more than 20 clicks. A click was defined as each press of the mouse button on a new hyperlink; each search for a new item started from the official main page.
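The coding scheme above can be sketched as follows. This is a minimal illustration with hypothetical click counts (the actual data set covers 75 GRI items across 12 countries); the country labels and the exact treatment of the ‘not accessed’ category as a ceiling value of 20 are illustrative assumptions.

```python
import statistics

NOT_ACCESSED = 20  # assumed ceiling value for items that were never found

# Hypothetical click counts for a handful of GRI items; None = not accessed.
raw_clicks = {
    "Country A": [4, 7, 9, None, 6],
    "Country B": [None, None, 18, 12, 20],
}

def describe(clicks):
    """Code 'not accessed' items with the ceiling value, then summarize."""
    coded = [NOT_ACCESSED if c is None else c for c in clicks]
    return {
        "mean": statistics.mean(coded),
        "median": statistics.median(coded),
        "sd": statistics.stdev(coded),
    }

for country, clicks in raw_clicks.items():
    print(country, describe(clicks))
```

Per-country means, medians and standard deviations computed this way correspond to the descriptive statistics reported in Table 1.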
Given any data set, it is often essential to identify its dimensional structure correctly, since this is the basis of the statistical analysis of the data. Several statistical methods are available to reduce the complexity of the data and identify its dimensional structure, specifically the number of (dominant) dimensions: factor analysis, cluster analysis and multidimensional scaling. Multidimensional scaling is a technique for analysing similarity or dissimilarity among a set of objects. Cluster analysis attempts to discover natural groupings of objects. The purpose of factor analysis is to describe and explain the correlation among a large set of variables in terms of a small number of underlying dimensions. Factor analysis is, therefore, the most suitable technique for building the proposed accountability index.
However, in the present research the number of variables (GRI items) greatly outnumbers the number of subjects (SA countries), so that factor analysis should not be used to build the accountability index (MacCallum, Widaman, Zhang, & Hong, 1999). Clustering items is an alternative to factor analysis (Borgen & Barnett, 1987; Roussos, Stout, & Marden, 1998; Zhang, 2013; Zhang & Stout, 1999) based on a much simpler model. The i-clust algorithm (Revelle, 1978) was used in the present study. This algorithm performs a hierarchical item cluster analysis, using the product-moment correlation as a measure of similarity between items and generating a bottom-up solution that forms composite scales by grouping items. Clusters are formed until either Cronbach’s reliability coefficient α (Cronbach, 1951) or Revelle’s β (Revelle, 1979) fails to increase. Revelle’s beta coefficient is one of the reliability measures proposed in response to critiques of, and cautions about, using Cronbach’s alpha to judge the internal consistency of summated scales, as in the present case. For a complete review of several reliability measures and their behaviour, see Sijtsma (2009) and Revelle and Zinbarg (2009).
Compared to traditional factor or principal component analysis in scale development, the i-clust approach is psychometrically more coherent, as it stops clustering when the internal consistency estimates (either the alpha or the beta coefficient) fail to increase (Cooksey & Soutar, 2006).
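As a rough illustration of this internal-consistency criterion (a sketch only, not the psych ICLUST implementation), Cronbach's alpha for a set of items can be computed from the item variances and the variance of the item totals:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a summated scale.

    items: list of equal-length score vectors, one per item
    (here, e.g., click counts per country for each GRI item).
    """
    k = len(items)                      # number of items in the scale
    n = len(items[0])                   # number of subjects (countries)
    totals = [sum(item[j] for item in items) for j in range(n)]

    def var(xs):                        # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))
```

In the clustering procedure, two items (or clusters) are only merged if this coefficient, or Revelle's beta, does not fail to increase.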
Unless otherwise noted, all the calculations and graphs in the following were done using R v.3.5.2, package ‘psych’ (Revelle, 2017).
The descriptive statistical results (see Table 1) show that the number of clicks necessary to obtain the information on the 75 GRI standard items is lower in Uruguay (Mean = 8.93, Median = 7), Chile (Mean = 9.78, Median = 7), Argentina (Mean = 10.97, Median = 8.5) and Colombia (Mean = 11.69, Median = 8), while it is very high in Venezuela (Mean = 17.26, Median = 20). Besides, the standard deviations of all 12 countries show great variability in the number of clicks needed to access the items.
Descriptive statistics by country
Appendix C presents the descriptive statistics by item. It has to be noted that some items seem to be more difficult to find, if they are found at all, across all the South American countries, namely IT-14 (Has the central government been awarded prizes or other recognition during the period in question?), IT-32 (Tax pressure), IT-74 (Is information published on the disposal of waste water by the community?), IT-36 (Are the services costs disclosed?) and IT-4 (Does this statement include events, achievements and failures during the period in question?), with respective mean values 18.50, 18.08, 17.83, 17.42 and 17.17, and a median of 20 in all five cases. This means that in most SA countries, information on those five items is unattainable. However, a few GRI items can be easily found in most SACGs’ web portals, namely IT-9 (Do central government officials have area-defined responsibilities?) and IT-10 (Is the situation of the regional seat of government stated?), with respective mean values 6 and 5.83, and medians 6.5 and 3.
In order to identify more formally clusters of GRI items with similar accessibility, we further analysed the accessibility of the GRI items in all 12 SA countries’ web portals by means of the aforementioned i-clust algorithm (Revelle, 1978, 1979), using the statistical software R. The final cluster structure is shown in Table 2; loadings smaller than 0.4 in absolute value are omitted to facilitate readability. Also, the items are listed in descending order of the absolute value of their loading in each cluster to facilitate interpretability. Item loadings are the correlation coefficients between the observed items and the extracted clusters.
It can be seen at first glance that the 75 items divide into eight completely separate clusters, which need to be studied to obtain an initial adequate interpretation. The name assigned to each cluster reflects the item or items with which that cluster presents the greatest correlation in absolute value (in bold in Table 2); the assigned names appear below. Also, to facilitate cluster identification, grey and white cells have been used.
Cluster 1 (C1) is defined as Environmental because the most correlated item (IT-69) concerns the government’s attention to energy consumption and environmental efficiency. The negative correlation means that the higher the value of the cluster, the worse the country is at disclosing environmental issues.
Cluster 2 (C2) is designated as Expenditure since the main associated items (IT-33, IT-34, IT-35) are linked with governments’ gross and capital expenditure.
Cluster 3 (C3) is specified as Social as the most connected variables (IT-7, IT-61) describe government employees’ salaries, competitors and trademarks.
Cluster 4 (C4) is interpreted as a Strategic cluster since the most interrelated variables (IT-3, IT-2, IT-59) concern the disclosure of the governments’ strategies and main long-term goals.
Cluster 5 (C5) is denominated as Economic because the most linked variable (IT-55) is about the Governments’ subsidies and budgeting issues.
Cluster 6 (C6) is defined as Information as the most correlated variable (IT-15) concerns the frequency and dates on which governments disclose information on the web platform.
Cluster 7 (C7) is specified as Macroeconomic since the most correlated variable (IT-30) concerns published macroeconomic information. The negative correlation means that the higher the value of the cluster, the worse the country is at disclosing macroeconomic issues.
Cluster 8 (C8) is described as Organizational because the most correlated variable (IT-5) takes the governments’ organizational perspective. Similarly to C1 and C7, the negative correlation means that the higher the value of the cluster, the worse the country is at disclosing organizational issues.
The presented eight-cluster solution has three merits. Firstly, almost all items are highly correlated with only one cluster. Secondly, all items have at least one cluster loading greater than 0.5 in absolute value, which is considered to be very significant (Hair et al., 2014). Finally, Table 3 shows that in this structure all clusters are internally consistent, as all Cronbach’s alphas exceed 0.78, which – according to DeVellis (2016) and DeVon et al. (2007) – denotes strong internal consistency within each item cluster. Also, all clusters have a strong underlying common denominator, as all Revelle’s betas are greater than 0.6 except that of C1, whose value of 0.56 is still considered adequate (Revelle, 1979).
Alpha and Beta coefficients of internal reliability for each cluster
Once the eight clusters have been extracted, the scores in each item-cluster for the twelve South American countries are calculated (see Appendix D) and these results are presented graphically in Figure 1.
For cluster 1 (Environmental) and cluster 3 (Social), the values for all 12 countries are quite similar, close to zero, with the highest value in Suriname. The lowest value is distinctly Chile’s for C1, while for C3 that position is shared by Bolivia, Uruguay and Venezuela. It has to be remembered that for C1, higher cluster values mean lower disclosure of environmental issues.
Values for cluster 2 (Expenditure), cluster 7 (Macroeconomic) and cluster 8 (Organizational) are clearly dissimilar across most countries. Both C2 and C8 have their highest value in Venezuela, over 100 in C2 and over 150 in C8, whereas the lowest values are well below 50 in Suriname for C2 and barely below 50 in Chile for C8. Nevertheless, the interpretation of C7 and C8 has to be reversed, as the correlations are negative. Therefore, Chile is actually the country with the best organizational disclosure (C8), while Venezuela shows the worst behaviour. Likewise, Uruguay is the worst at disclosing macroeconomic issues (C7), whereas Suriname achieves the best value.
Values for C7 are the exception in this study because they are noticeably below zero for all 12 SA countries. Uruguay, together with Colombia, has the least bad values, both above −50, whereas Suriname is at almost −200. However, the correlations were negative for C7, so Suriname is the country with the best disclosure of macroeconomic issues, while Uruguay and Colombia have the worst disclosure policies regarding those issues.
Values for cluster 4 (Strategic), cluster 5 (Economic) and cluster 6 (Information) are around 50, with Brazil, Peru and Suriname reaching, respectively, the best values, while the worst values are attained by several countries in C4, by Paraguay in C5 and by Peru in C6 (with a below-zero value).
The purpose of this study was to describe, under the Global Reporting Initiative (GRI) approach, whether the official websites of the South American governments are efficient in their transparency, measured through the dimensions of the defined accountability index, as it has been shown that greater institutional transparency has positive effects on the efficiency of strategy, organization and spending by optimising the use of public resources.
Based on our results, we can conclude that Uruguay and Chile are the best performers from the point of view of accountability, being also the best economic performers in the region, according to their GDP per capita (World Bank, 2018). However, it seems that Uruguay is not so good in macroeconomic issues (C7).
Additionally, Venezuela and Suriname have made an effort to achieve optimal results on their government websites, scoring the best results in Expenditure (Venezuela) and in the disclosure of Social, Information and Macroeconomic issues; these findings differ from other research (Navarro-Galera, Alcaraz-Quiles & Ortiz-Rodriguez, 2016). Further research is needed to determine whether the accountability index correlates with improvements in countries’ ICTs and in the information disclosed on their websites. Moreover, the results in Coy and Dixon (2004) differ mainly because a Delphi test was chosen there to measure the quality of university annual reports; other issues, such as the qualitative report characteristics of the accounting standard (timeliness and relevance), are the same.
The contribution of this research is the definition and elaboration of an accountability index based on GRI items, which can facilitate comparison among countries for benchmarking and for re-engineering their web pages. In this sense, transparency and citizen interaction will be improved. Also, the results of the present study contribute a theoretical and empirical framework for both academics and practitioners regarding accountability measures, since studies in that area are scarce.
There are some limitations in this study that could be addressed in future research. First, some GRI items (e.g., IT-10 or IT-74) seem not to be intended for the national level or not clearly measurable; hence, the questionnaire may need minor adjustments in further national-level studies. Second, although beyond the scope of the present paper, further analysis of how the extracted clusters correlate with the respective SA countries’ financial and economic variables is needed.
Alcaraz Quiles, F.J., Urquía-Grande, E., Muñoz-Colomina, C.I., & Rautiainen, A. (2018). E-government implementation: transparency, accessibility and usability of government websites. In Alcaide Muñoz, L., & Rodriguez Bolivar, M.P. (Eds.), International E-Government Development, 291–306. Palgrave Macmillan, Cham. 10.1007/978-3-319-63284-1_12
Alcaraz Quiles, F.J., Navarro, A., & Ortiz Rodríguez, D. (2014). A comparative analysis of transparency in sustainability reporting by local and regional governments. Lex Localis – Journal of Local Self-Government, 12(1), 55–78.
Armstrong, E. (2005). Integrity, Transparency and Accountability in Public Administration: Recent Trends, International Developments and Emerging Issues. New York: United Nations Department of Economic and Social Affairs.
Bernard, M. (2002). Examining the effects of hypertext shape on user performance. Usability News, 4(2). Retrieved from https://www.researchgate.net/publication/246126441_Examining_the_effects_of_hypertext_shape_on_user_performance.
Blackmon, M., Polson, P., Kitajima, M., & Lewis, C. (2002). Cognitive walkthrough for the web. Proceedings of the Conference on Human Factors in Computing Systems, Minneapolis, Minnesota, April 20–25, 2002, 4(1), 463–470. New York: ACM Press.
Borgen, F. H., & Barnett, D. C. (1987). Applying cluster analysis in counseling psychology research. Journal of Counseling Psychology, 34(4), 456–468.
Bovens, M. (2007). Analysing and assessing accountability: A conceptual framework. European Law Journal, 13(4), 447–468.
Caba Pérez, C., Rodríguez Bolívar, P., & López Hernández, A.M. (2008). E-government process and incentives for online public financial information. Online Information Review, 32(3), 379–400.
Cooksey, R. W., & Soutar, G. N. (2006). Coefficient beta and hierarchical item clustering: An analytical procedure for establishing and displaying the dimensionality and homogeneity of summated scales. Organizational Research Methods, 9(1), 78–98. 10.1177/1094428105283939
Coy, D., & Dixon, K. (2004). The public accountability index: Crafting a parametric disclosure index for annual reports. The British Accounting Review, 36(1), 79–106.
DeVellis, R. F. (2016). Scale Development: Theory and Applications (4th ed.), Applied Social Research Methods, Vol. 26. Sage Publications.
DeVon, H.A., Block, M.E., Moyle-Wright, P., Ernst, D.M., Hayden, S.J., Lazzara, D.J., Savoy, S.M., & Kostas-Polston, E. (2007). A psychometric toolbox for testing validity and reliability. Journal of Nursing Scholarship, 39(2), 155–164. 10.1111/j.1547-5069.2007.00161.x
Filgueiras, F. (2016). Transparency and accountability: Principles and rules for the construction of publicity. Journal of Public Affairs, 16(2), 192–202.
Finger, M., & Cotti, L. (2002). Evaluation of e-government efforts in Europe. IDHEAP Working Paper 7/2002.
Florini, A. (1999). Does the invisible hand need a transparent glove? The politics of transparency. Proceedings of the Annual World Bank Conference on Development Economics, April 28–30, Washington, DC: World Bank, 163–184.
Fox, J. (2007). The uncertain relationship between transparency and accountability. Development in Practice, 17(4–5), 663–671.
Glover, D., Bennett-Harper, S., Alexander, D., & Sanniez, E. (1999). Assessment of electronic government information products. Prepared under contract by Westat for the United States National Commission on Libraries and Information Science, Maryland.
Grant, R., & Keohane, R. (2005). Accountability and abuses of power in world politics. The American Political Science Review, 99(1), 29–43.
Hood, C., & Heald, D. (Eds.) (2006). Transparency: The Key to Better Governance? Proceedings of the British Academy, 135. Oxford University Press for The British Academy.
Huberman, B., Pirolli, P., Pitkow, J., & Lukose, R. (1998). Strong regularities in World Wide Web surfing. Science, 280(5360), 95–97.
IFAC – International Federation of Accountants (2014). International framework: Good governance in the public sector. Retrieved from http://www.cipfa.org/policy-and-guidance/standards/international-framework-good-governance-in-thepublic-sector.
IFAC – International Federation of Accountants (2015). Accountability now. Retrieved from https://www.ifac.org/about-ifac/accountability-now.
IFAC – International Federation of Accountants (2018). International Public Sector Financial Accountability Index – 2018 Status Report. Retrieved from https://www.ifac.org/about-ifac/accountability-now/international-public-sector-financial-accountability-index.
Ingram, W., & Gray, E. (1998). A federal standard on electronic media. NTIA Report 98-350, US Department of Commerce.
Jaeger, P., & Bertot, J. (2010). Transparency and technological change: Ensuring equal and sustained public access to government information. Government Information Quarterly, 27(4), 371–376.
Kalbach, J. (2002). The myth of ‘seven plus or minus 2’. Dr. Dobb’s Web Review, January 14, 2002. Retrieved from http://www.ddj.com/documents/s=4058/nam1012431804/.
Kerschot H. & Poté K. (2002). Web-based survey on electronic public services. Cap Gemini Ernst & Young eEurope 2002.
Kopits G. & Craig J. (1998). Transparency and government operations. International Monetary Fund (IMF) January(158) 1–42.
Lourenço R.P. (2015). An analysis of open government portals: A perspective of transparency for accountability. Government Information Quarterly 32(3) 323–332.
MacCallum R. C. Widaman K. F. Zhang S. & Hong S. (1999). Sample size in factor analysis. Psychological Methods 4 84–99.
Milic-Frayling N. Jones R. Rodden K. Smyth G. Blackwell A. & Sommerer R. (2004). Smartback: supporting users in back navigation. Proceedings of the 13th International Conference on World Wide Web 63–71.
Navarro-Galera A. Alcaraz-Quiles F. & Ortiz-Rodríguez D. (2016). Online dissemination of information on sustainability in regional governments: Effects of technological factors. Government Information Quarterly 33(1) 53–66.
Pina V. Torres L. & Royo S. (2009). E-government evolution in EU local governments: A comparative perspective. Online Information Review 33(6) 1137–1168.
Porter J. (2003). Testing the three-click rule. User Interface Engineering. Retrieved from http://www.Uie.com/articles/three_click_rule/.
Revelle W. (1978). ICLUST: A cluster analytic approach to exploratory and confirmatory scale construction. Behavior Research Methods 10(5) 739–742.
Revelle W. (1979). Hierarchical cluster analysis and the internal structure of tests. Multivariate Behavioral Research 14 57–74.
Revelle W. (2017). psych: Procedures for personality and psychological research. Retrieved from https://cran.r-project.org/web/packages/psych/index.html
Revelle W. & Zinbarg R. E. (2009). Coefficients alpha, beta, omega, and the glb: Comments on Sijtsma. Psychometrika 74(1) 145–154.
Ritter D. (2002). LabVIEW GUI: Essential techniques. New York: McGraw-Hill.
Roussos L. A. Stout W. F. & Marden J. I. (1998). Using new proximity measures with hierarchical cluster analysis to detect multidimensionality. Journal of Educational Measurement 35(1) 1–30.
Salas Quirós L. (2015). La rendición de cuentas en la gestión del sector público centralizado de Costa Rica. Ph.D. Dissertation. Retrieved from https://eprints.ucm.es/28060/.
Scott D. & Meijer A. (2016). Transparency and public value - Analyzing the transparency practices and value creation of public utilities. International Journal of Public Administration 39(12) 940–951.
Sijtsma K. (2009). On the use, the misuse, and the very limited usefulness of Cronbach’s alpha. Psychometrika 74(1) 107–120.
Stiglitz J. (2002). Transparency in government. In Islam R. Djankov S. & McLeish C. (eds) The right to tell role of mass media in economic development (pp. 27–44). Washington DC: The International Bank for Reconstruction and Development - The World Bank.
West D. M. (2002). Global e-government. Center for Public Policy Brown University Providence RI. https://www.digitale-chancen.de/transfer/downloads/MD372.pdf
Wong W. & Welch E. (2004). Does E-government promote accountability? A comparative analysis of website openness and government accountability. Governance 17(2) 275–297.
World Bank (2018). GDP per capita PPP (current international $). Retrieved from https://data.worldbank.org/indicator/NY.GDP.PCAP.PP.CD?locations=ZJ&most_recent_value_desc=true&view=map
Zaphiris P. (2000). Depth vs. breadth in the arrangement of web links. Proceedings of the Human Factors and Ergonomics Society Annual Meeting Vol. 44(4) 453–456. Los Angeles CA: SAGE Publications.
Zeldman J. (2001). Taking your talent to the Web: A guide for the transitioning designer. Indianapolis Indiana: New Riders.
Zhang J. (2013). A procedure for dimensionality analyses of response data from various test designs. Psychometrika 78(1) 37–58.
Zhang J. & Stout W. (1999). The theoretical DETECT index of dimensionality and its application to approximate simple structure. Psychometrika 64(2) 213–249.
1. Is a statement made by the Head of Government on the importance of sustainability for the central government and its strategy?
2. Does this statement set out priorities, strategies and key factors for the short to medium term?
3. Does this statement address long-term trends relevant to priorities concerning sustainability?
4. Does this statement include events, achievements and failures during the period in question?
5. Does this statement include goal-oriented performance perspectives?
6. Does this statement include challenges and targets for the coming year and the forthcoming 3–5 years?
7. Does the central government own trademarks?
8. Are different areas clearly defined?
9. Do central government officials have area-defined responsibilities?
10. Is the location of the regional seat of government stated?
11. Is a statement made of the number of countries in which significant activities are carried out?
12. Is the number of employees stated?
13. Have significant changes taken place in the central government's structure or size?
14. Has the central government been awarded prizes or other recognition during the period in question?
15. Is a statement made of the period corresponding to the information supplied?
16. Is the date of publication of this information stated?
17. Is the frequency of presentation of this information stated?
18. Is there a liaison person for questions concerning the supplied information?
19. Does the supplied information include dates of specific interest for suppliers and users?
20. Is priority assigned to the aspects addressed in the supplied information?
21. Is there a given person or government body responsible for defining organization strategy?
22. Does the chief official hold any other public or private post?
23. Do works' committees or workers' representatives exist?
24. Are the stakeholders included in the supplied information?
25. Does the presented information include the government program?
26. Are the government program commitments met?
27. Does the ruling party have an absolute majority?
28. Are stakeholder selection and identification criteria included in the supplied information?
29. Is an expenditure forecast/beneficiary population published?
30. Is a revenue forecast/beneficiary population published?
31. Are revenues transferred from other public administrations/total revenues published?
32. Tax pressure
33. Is gross expenditure, detailed by type of payment, published?
34. Is gross expenditure, detailed by financial classification, published?
35. Is capital expenditure, detailed by financial classification, published?
36. Are the costs of services disclosed?
37. Average payment period
38. Are the current competitions disclosed?
39. Is the contractor profile disclosed?
40. Future services calls
41. Is the policy on internal promotion published?
42. Are staff training facilities published?
43. Indebtedness capacity
44. Is a statement made of future financial risk?
45. Equity and assured goods
46. Is a report published on the expenditure forecast?
47. Are data given on received subsidies?
48. Are financial statements disclosed?
49. Is information about accounting policies disclosed?
50. Is the expense budget disclosed?
51. Does the latter include medium-term perspectives?
52. Are the following key economic assumptions and forecasts made public: GDP growth, employment, unemployment, inflation and interest rates?
53. Is the offer of services made public?
54. Social services expenses
55. Is an announcement of subsidies for business activities made?
56. Is a statement made on pension obligations to employees?
57. Are grant offers to neighbourhood associations made public?
58. Are offers of public employment made public?
59. Are grant offers to NGOs made public?
60. Are indicators of effectiveness and efficiency published?
61. Initial salary/Local minimum salary
62. Local supplier expense/Total expense
63. Is information published on the initiatives taken to alleviate the environmental impact of products and services?
64. Is the degree of reduction of the above impact stated?
65. Is a statement made of the direct consumption of energy obtained from primary sources?
66. Is a statement made of the consumption of intermediate energy?
67. Is a statement made of the actions taken to increase savings via conservation or increased efficiency?
68. Is information published on initiatives taken to promote products and services that are energy efficient or based on the use of renewable energies?
69. Is information published on reductions in energy consumption as a result of the above initiatives?
70. Is information published on the initiatives taken to reduce indirect energy consumption?
71. Is information published on reductions achieved by the above initiatives?
72. Is information published on the different sources of water supply employed, and the volume obtained from each source?
73. Is information published on the percentage and total volume of water that is recycled and reused in the community?
74. Is information published on the disposal of waste water by the community?
75. Is information published on the total and type of expenditure on environmental investment?