Radosław Januszewski, Rafał Różycki and Grzegorz Waligóra
The goal of this paper is to specify a software framework for evaluating the efficiency of different mechanisms for concurrent processes, notably process synchronization mechanisms. The paper discusses the concept of this framework, its potential users, and some necessary considerations, including assumptions. Further, it defines general requirements for the framework and presents its desired conceptual design. Conclusions and possible directions for future work close the paper.
We present the results of an investigation into the efficiency of face detection algorithms in a banking client visual verification system. The video recordings were made under real conditions in three bank operating outlets, using a miniature industrial USB camera. The aim of the experiments was to assess the practical usability of the face detection method in a biometric bank client verification system. The main assumption was to keep user interaction with the application as simple as possible. The applied face detection algorithms are described, and the detection results achieved under real bank environment conditions are presented. Practical limitations of the application, based on the problems encountered, are discussed.
Among data clustering algorithms, the k-means (KM) algorithm is one of the most popular clustering techniques due to its simplicity and efficiency. However, k-means is sensitive to the initial centers and suffers from the local optima problem. The k-harmonic-means (KHM) clustering algorithm solves the initialization problem of k-means, but it also suffers from local optima. In this paper, we develop a new algorithm for solving this problem based on an improved version of the particle swarm optimization (IPSO) algorithm and KHM clustering. In the proposed algorithm, IPSO is equipped with the Cuckoo Search algorithm and two new concepts used in PSO in order to improve efficiency, speed up convergence, and escape from local optima. IPSO updates particle positions based on a combination of the global worst and global best with the personal worst and personal best, applied dynamically in each iteration. Experimental results on five real-world datasets and two artificial datasets confirm that this improved version is superior to k-harmonic means and the regular PSO algorithm. The simulation results show that the new algorithm is able to create promising solutions with fast convergence, high accuracy and correctness, while markedly improving the processing time.
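As a reminder of the two ingredients the abstract above combines, here is an illustrative Python sketch (not the authors' implementation) of the standard KHM objective and a plain PSO position update; the exponent p = 3.5 and the PSO coefficients are common textbook defaults, not values taken from the paper.

```python
import numpy as np

def khm_objective(X, centers, p=3.5):
    """KHM(X, C) = sum_i k / sum_j 1/||x_i - c_j||^p (harmonic mean of distances)."""
    k = len(centers)
    # pairwise point-to-center distances; epsilon avoids division by zero
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
    return float(np.sum(k / np.sum(1.0 / d**p, axis=1)))

def pso_step(positions, velocities, pbest, gbest,
             w=0.7, c1=1.5, c2=1.5, rng=None):
    """Standard PSO velocity/position update; each particle encodes k centers."""
    rng = rng or np.random.default_rng(0)
    r1, r2 = rng.random(positions.shape), rng.random(positions.shape)
    velocities = (w * velocities
                  + c1 * r1 * (pbest - positions)
                  + c2 * r2 * (gbest - positions))
    return positions + velocities, velocities

# toy usage: two well-separated clusters in 2-D
X = np.array([[0., 0.], [0., 1.], [5., 5.], [5., 6.]])
C = np.array([[0., 0.5], [5., 5.5]])
print(khm_objective(X, C) < khm_objective(X, np.zeros((2, 2))))  # better centers -> lower cost
```

In a PSO-based KHM optimizer, `khm_objective` would serve as the fitness function each particle is scored with after every `pso_step`.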
The rapid development of technology has allowed computer simulations to become routinely used in an increasing number of fields of science. These simulations become more and more realistic, and their energy efficiency grows due to progress in computer hardware and software. As humans merge with machines via implants, brain-computer interfaces and increased activity involving information instead of material objects, philosophical concepts and theoretical considerations on the nature of reality are beginning to concern practical, working models and testable virtual environments. This article discusses how simulation is understood and employed in computer science today, how software, hardware and the physical universe unify, how simulated realities are embedded one in another, how complicated these can become in practical application scenarios, and the possible consequences of such situations. A number of basic properties of universes and simulations in such multiply nested structures are reviewed, and the relationship of these properties with a level of civilizational development is explored.
One of the most important topics in research concerning 3D local descriptors is computational efficiency. The state-of-the-art approach to this matter consists in using keypoint detectors that effectively limit the number of points for which the descriptors are computed. However, the choice of keypoints is not trivial and might have negative implications, such as the omission of relevant areas. Instead, focusing on the task of single object detection, we propose a keypoint-less approach to attention focusing in which the full scene is processed in a hierarchical manner: weaker, less rejective and faster classification methods are used as heuristics for increasingly robust descriptors, which allows more demanding algorithms to be used at the top level of the hierarchy. We have developed a massively parallel, open source object recognition framework, which we use to explore the proposed method on demanding, realistic indoor scenes, applying the full power available in modern computers.
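The hierarchical filtering idea can be sketched generically: cheap, permissive tests run first over all candidate points, and only survivors reach the more expensive stages. The stage functions below are placeholders for illustration, not the framework's actual classifiers or descriptors.

```python
def cascade_filter(points, stages):
    """Apply increasingly expensive accept/reject stages in order.

    Each stage sees only the points that survived the previous one, so
    the costliest test at the end runs on the smallest candidate set.
    """
    survivors = points
    for stage in stages:
        survivors = [p for p in survivors if stage(p)]
        if not survivors:
            break  # nothing left for later, more robust stages
    return survivors

# toy stages: a fast, permissive coarse test followed by a strict
# 'descriptor-like' test; each point is (x, y, score)
cheap  = lambda p: p[2] > 0.1
strict = lambda p: p[2] > 0.8
pts = [(0, 0, 0.05), (1, 2, 0.5), (3, 1, 0.9)]
print(cascade_filter(pts, [cheap, strict]))  # [(3, 1, 0.9)]
```

The design point is that per-stage cost may grow sharply as long as the surviving candidate set shrinks faster, which is what makes keypoint-free full-scene processing tractable.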
Transactional Memory (TM) is an alternative way of synchronizing concurrent accesses to shared memory by adopting the abstraction of transactions in place of low-level mechanisms like locks and barriers. TMs usually apply optimistic concurrency control to provide a universal and easy-to-use method of maintaining correctness. However, this approach produces a high number of aborts in high-contention workloads, which can adversely affect performance. Optimistic TMs can also cause problems when transactions contain irrevocable operations. Hence, pessimistic TMs were proposed to solve some of these problems. An important way of achieving efficiency in pessimistic TMs is to use early release. On the other hand, early release is seemingly at odds with opacity, the gold standard of TM safety properties, which does not allow transactions to make their state visible until they commit. In this paper we propose a proof technique that makes it possible to demonstrate that a TM with early release can be opaque as long as it prevents inconsistent views.
Wireless Sensor Networks (WSNs) have emerged as an important supplement to modern wireless communication systems due to their wide range of applications. Recent research addresses the various challenges of sensor networks with increasing success; however, energy efficiency remains a matter of concern for researchers. Meeting countless security needs, timely data delivery and quick reaction, efficient route selection, multi-path routing, etc. can only be achieved at the cost of energy. Hierarchical routing is particularly useful in this regard. The proposed algorithm, the Energy Aware Cluster Based Routing Scheme (EACBRS), aims at conserving energy with the help of hierarchical routing by calculating the optimum number of cluster heads for the network, selecting an energy-efficient route to the sink, and offering congestion control. Simulation results show that EACBRS performs better than existing hierarchical routing algorithms such as the Distributed Energy-Efficient Clustering (DEEC) algorithm for heterogeneous wireless sensor networks and the Energy Efficient Heterogeneous Clustered scheme for Wireless Sensor Networks (EEHC).
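For illustration of the cluster-head election that hierarchical schemes of this kind build on, here is a generic LEACH-style probabilistic threshold in Python; this is background material only, since the actual EACBRS head-count formula, route selection and congestion control are defined in the paper and not reproduced here.

```python
import random

def ch_threshold(p, r):
    """Generic LEACH election threshold T(n) = p / (1 - p * (r mod 1/p)),
    where p is the desired fraction of cluster heads and r is the round."""
    return p / (1 - p * (r % (1 / p)))

def elect_cluster_heads(node_ids, p=0.05, r=0, seed=42):
    """Each eligible node independently becomes a cluster head with
    probability T(n) for the current round."""
    rng = random.Random(seed)
    t = ch_threshold(p, r)
    return [n for n in node_ids if rng.random() < t]

heads = elect_cluster_heads(range(100))
print(len(heads))  # roughly p * 100 nodes become heads in round 0
```

In round 0 the threshold equals p itself, so on average a fraction p of the nodes self-elect; energy-aware schemes refine this by folding residual energy and distance to the sink into the election decision.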
The evaluation of scientific research projects is an important procedure before such projects are approved. In this paper, a BP neural network and a linear neural network are adopted to evaluate scientific research projects. An evaluation index system with 12 indexes is set up. The basic principle of the neural network is analyzed, the BP and linear neural network models are constructed, and the output error function of the networks is introduced. Matlab software is used to set the parameters and compute the networks. By computing a real-world example, the evaluation results of the scientific research projects are obtained, and the results of the BP neural network, the linear neural network and linear regression forecasting are compared. The analysis shows that the BP neural network achieves higher efficiency than the linear neural network and linear regression forecasting on the scientific research project evaluation problem. The method proposed in this paper is thus an effective way to evaluate scientific research projects.
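As an illustration of the kind of BP (backpropagation) network described above, here is a minimal NumPy sketch with 12 inputs (one per evaluation index) and one output score; the paper works in Matlab, and the layer sizes, learning rate and toy data below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((40, 12))            # 40 projects x 12 evaluation indexes
y = X.mean(axis=1, keepdims=True)   # toy target score in (0, 1)

# one hidden layer of 8 sigmoid units, one sigmoid output
W1 = rng.normal(0, 0.5, (12, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1));  b2 = np.zeros(1)
sig = lambda z: 1 / (1 + np.exp(-z))

def mse():
    return float(np.mean((sig(sig(X @ W1 + b1) @ W2 + b2) - y) ** 2))

mse_before = mse()
lr = 0.5
for _ in range(2000):
    h = sig(X @ W1 + b1)                 # hidden layer activations
    out = sig(h @ W2 + b2)               # predicted evaluation score
    d_out = (out - y) * out * (1 - out)  # output-layer error gradient
    d_h = (d_out @ W2.T) * h * (1 - h)   # gradient backpropagated to hidden layer
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)

mse_after = mse()
print(mse_after < mse_before)  # training reduces the output error
```

A linear neural network, by contrast, would drop the hidden layer and the sigmoids, reducing the model to linear regression fitted by gradient descent.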