BitTorrent Sync Application




Summary

As private and professional internet users have become increasingly concerned with privacy and data protection, the privacy afforded by popular cloud file synchronization services such as Google Drive, Dropbox, and OneDrive has come under considerable press scrutiny.

Some of these services have been reported in the recent past to share their users' information with government intelligence agencies without warrants. BitTorrent Sync has been adopted as an alternative by numerous people, and by 2014 it already had three million users.

The service is fully decentralized yet provides much of the same synchronization functionality as cloud-based services. It employs encryption for data transmission and, optionally, for remote storage.

Understanding BitTorrent Sync and its digital investigative implications is critical for law enforcement and forensic investigators in future evaluations. With rising home-user bandwidth and advances in professional and consumer computing power, the volume of data being created on computing devices continues to increase.

For users of mobile devices, accessing this data has been a tall order. With rapid connectivity, increased availability, and widespread internet access, the notion of resilient storage, high availability, and off-site backup has moved out of a domain that was mainly the preserve of large corporations and has become rapidly popular with ordinary computer users and everyday data consumers.


Applications like Dropbox and Evernote exploit the falling cost of hard disk storage offered by Storage as a Service (SaaS) providers such as Amazon S3. The key advantage of services like Apple iCloud, Dropbox, Microsoft OneDrive, and Google Drive is that user data is stored in what is effectively an extension of the user's machine, with no user interaction required after installation.

This is backed by a fully distributed data center model that is far beyond the financial reach of the ordinary consumer. The data can be accessed on various devices without reformatting partitions or wasting space by creating multiple copies of one file for each device. Dropbox and a number of other services also have offline applications that permit synchronized data to be kept in local folders for offline access.


Each of these services can be categorized as a cloud synchronization service: while data is coordinated between user machines, a copy of that data is also stored remotely in the cloud. Much of this data can be accessed by government intelligence agencies without a warrant. BitTorrent Sync offers the same functionality without cloud storage and is consequently seen by many as the better option.

Given these advantages, the BitTorrent Sync application has become popular with those who want to replicate and synchronize their files; by the end of 2013 it had attracted two million users. This work's contribution is a forensic evaluation of the BitTorrent Sync client application: the remnants left after installation, its behavior, and the artifacts it creates. An evaluation of the network traffic sequence and the file interactions employed during synchronization is also included.

This information can be of use to digital forensic investigators when BitTorrent Sync is found installed on machines under investigation. It can be employed in recovering lost data, establishing how that data was modified, or locating where it has been synchronized to.


Knowing how BitTorrent Sync operates can help direct digital investigations toward additional remote machines where more relevant data may be replicated. The technology is applicable to several kinds of crime investigation, including industrial espionage, malicious software distribution, and the sharing of child exploitation material. The crime under investigation determines whether the remote machines are owned and operated by a single suspect or by a group with a mutual goal.

The protocol is powerful because of its use of file parts, where each part can be manipulated and controlled separately. Because BitTorrent Sync can use a DHT for data transfers, there is no central authority that can be used to manage data authentication. Suspect files located on a system could have been downloaded from several sources and uploaded to several recipients.


Analysis

Three distinct settings determine the resources employed in peer discovery, as well as the available paths for traffic transmission. BitTorrent Sync employs peer discovery techniques similar to those of the ordinary BitTorrent protocol. The local peer discovery packet contains a BSYNC header and a ping message type, which includes the 20-byte share ID of the advertised share, the IP address of the sending host, and the port.

LAN hosts that receive the packet drop it if they have no interest in the share ID. Hosts that are interested respond with a UDP packet to the advertised port. This response lacks a BSYNC header, and its data field contains only the responding peer's ID.
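
The sketch below illustrates this local discovery exchange. The exact byte layout of a BSYNC packet is an assumption made for illustration, based only on the fields described above (a BSYNC marker, a ping message type, the 20-byte share ID, the sender's IP, and the port); the multicast address and port are likewise hypothetical.

    import socket
    import struct

    MULTICAST_GROUP = "239.192.0.0"   # hypothetical LAN discovery address
    DISCOVERY_PORT = 3838             # hypothetical discovery port

    def build_ping(share_id: bytes, sender_ip: str, sender_port: int) -> bytes:
        """Pack an illustrative BSYNC ping: marker, type, share ID, IP, port."""
        assert len(share_id) == 20
        ip_bytes = socket.inet_aton(sender_ip)
        return b"BSYNC" + b"ping" + share_id + ip_bytes + struct.pack("!H", sender_port)

    def parse_ping(packet: bytes):
        """Return (share_id, ip, port) if the packet matches our illustrative layout."""
        if not packet.startswith(b"BSYNCping") or len(packet) < 35:
            return None
        share_id = packet[9:29]
        ip = socket.inet_ntoa(packet[29:33])
        (port,) = struct.unpack("!H", packet[33:35])
        return share_id, ip, port

    # Per the text, an interested host replies over UDP to the advertised port;
    # the reply carries no BSYNC header, only the responding peer's ID.
    def reply_if_interested(packet: bytes, my_shares: set, my_peer_id: bytes, sock):
        parsed = parse_ping(packet)
        if parsed is None:
            return
        share_id, ip, port = parsed
        if share_id in my_shares:          # otherwise the packet is silently dropped
            sock.sendto(my_peer_id, (ip, port))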

The tracker's three IP addresses are hosted on Amazon's EC2 cloud service. A get_peers request is sent to the tracker server by the client. Upon receiving this request, the tracker adds the client's IP address to the list of active peers for that share ID. Because the client requests this list for a secret it itself possesses, the server's response always contains at least one active peer, which is employed in requesting the client's information.

Clients can also perform peer discovery through a Distributed Hash Table (DHT). In this alternative, peers register their details in the DHT using the share secret and their port. Users who employ this alternative can avoid any kind of tracking server, though they may find peer discovery slower.
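
As a toy model of this alternative, the sketch below treats the DHT as a key-value store keyed on a hash of the share secret. Real DHT routing (for example, Kademlia node lookups across many machines) is omitted, and all names are illustrative rather than BitTorrent Sync's actual scheme.

    import hashlib

    class ToyDHT:
        """A single-node stand-in for a distributed hash table (illustration only)."""
        def __init__(self):
            self._table = {}   # key -> set of (ip, port) peers

        @staticmethod
        def key_for(secret: str) -> bytes:
            # Peers registering under the same secret derive the same key.
            return hashlib.sha1(secret.encode()).digest()

        def announce(self, secret: str, ip: str, port: int):
            self._table.setdefault(self.key_for(secret), set()).add((ip, port))

        def get_peers(self, secret: str):
            return self._table.get(self.key_for(secret), set())

    dht = ToyDHT()
    dht.announce("my-share-secret", "192.0.2.10", 4000)
    print(dht.get_peers("my-share-secret"))   # {('192.0.2.10', 4000)}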


The option of using predefined hosts is the final, and least detectable, peer discovery technique. The user can add a list of IP address and port combinations to the share's preferences. The peers on this list are then contacted directly with a BSYNC packet containing a ping message type.

The forensic analysis of these utilities is problematic. Unless a complete, up-to-date local synchronization exists, the full picture of the data may reside in temporary files, across multiple data centers, and in volatile storage such as system RAM.

Any digital forensic evaluation of these systems should pay particular attention to the access method, which is typically the web browser connecting to the service provider's access page. This temporary access highlights the significance of live forensic methods when investigating a suspect machine.


If power is disconnected from the suspect's machine, investigators lose more than access to the client's open documents: authentication material stored in RAM, such as passwords, is lost as well. An additional approach to cloud storage forensics involves access through the full client application, regardless of whether it has been interfered with by the user.

Anti-forensic attempts may involve deleting synchronized folders and uninstalling the application. When Dropbox is used through its client application, it creates a local folder whose contents are synchronized to an online duplicate of that folder. Dropbox provides 2 GB of storage for free, with the option to buy more space.

OneDrive is designed as a primarily online storage facility, with the option of synchronizing copies of files to a client folder. Most cloud storage services synchronize through some form of periodic checking to establish whether locally viewed versions have changed, and by comparing online and offline copies when communication is re-established.

Just like the peer discovery methods, BitTorrent Sync enables users to configure several options that affect how data is transferred from one peer to another. The seeding host attempts to communicate directly with the replication target. Traffic is encrypted by default whenever it travels outside the local LAN, and an application preference allows LAN encryption to be enabled if the user prefers.


Communication between hosts can be blocked when they are on different networks protected by firewalls, or on LAN segments locked down by inbound access controls. When a seeder creates a share, a master key is generated. This is essentially an all-access key that permits the share owner to modify, add, or remove the share's contents; it should only ever be distributed to a trusted collaborator. The read-only key permits the receiving user to read the synchronized data but not to modify the source contents in any way.
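
One way to picture this two-tier key model is sketched below: a read-only key derived one-way from the master key, so a holder of the read-only key can read but can never reconstruct the master key. The derivation scheme is an illustrative assumption, not BitTorrent Sync's actual construction.

    import hmac
    import hashlib
    import os

    def new_master_key() -> bytes:
        # All-access secret generated when the seeder creates the share.
        return os.urandom(20)

    def derive_read_only_key(master_key: bytes) -> bytes:
        # One-way derivation: the read-only key cannot be inverted to the master.
        return hmac.new(master_key, b"read-only", hashlib.sha256).digest()[:20]

    def can_modify(presented_key: bytes, master_key: bytes) -> bool:
        # Only the master key authorizes adding, changing, or removing content.
        return hmac.compare_digest(presented_key, master_key)

    def can_read(presented_key: bytes, master_key: bytes) -> bool:
        # Both the master key and the derived read-only key authorize reading.
        read_only = derive_read_only_key(master_key)
        return can_modify(presented_key, master_key) or \
            hmac.compare_digest(presented_key, read_only)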

When trying to recover lost data, it is possible to find that BitTorrent Sync has been installed on a machine. As a result of anti-forensic measures, a number of files may not be recoverable from the local hard disk. If the secret for a share is recovered, synchronizing with the suspect's secret may allow forensic investigators to recover the lost information from any nodes in the share that are still active.

Ordinary file system forensic analysis can identify synchronization artifacts left behind by a given share and by successive data synchronizations. The collected data should be stored in a digital evidence bag.
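
A minimal sketch of collecting such artifacts into an evidence manifest might look as follows. The artifact paths and the manifest format are illustrative assumptions, not a standard digital evidence bag layout.

    import hashlib
    import json
    import os
    import time

    def hash_file(path: str) -> str:
        """SHA-256 of a file, read in chunks so large artifacts fit in memory."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    def build_manifest(artifact_paths, case_id: str) -> dict:
        """Record each artifact's path, size, and hash at acquisition time."""
        return {
            "case_id": case_id,
            "acquired_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
            "artifacts": [
                {"path": p, "size": os.path.getsize(p), "sha256": hash_file(p)}
                for p in artifact_paths
            ],
        }

    # Hypothetical BitTorrent Sync artifact locations, for illustration only.
    paths = [p for p in (".sync/ID", ".sync/sync.log") if os.path.exists(p)]
    print(json.dumps(build_manifest(paths, "CASE-001"), indent=2))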


Usability Engineering of Cognitive Applications



Children's BCI Experiential Impact on Usability Engineering of Cognitive Applications

1.1 Emotional Importance in the Usability of Cognitive Applications

1.1.1    Children's Cognitive Applications

Understanding human emotions is important, as it can help reveal how people think. Children aged 4 to 6 years are well suited to such a study because children in this age bracket cannot articulate what emotion or feeling the things they interact with evoke (Ekman, 1992). In essence, the main aim of this study is to examine the emotional impact on children of technologies designed for children aged 4 to 6 years. The thesis also seeks to evaluate the usability of a cognitive application, based on children's emotions, at three stages of the software development process.

1.1.2    Brain-Computer Interface (BCI) Technology

Brain-computer interface (BCI) headset technology will be used as a pathway between human and computer. It will measure the user's emotions – such as engagement/boredom, frustration, meditation, instantaneous excitement, and long-term excitement – in order to understand the target user's emotional state and to assess how these emotions bear on the usability engineering of the game across three testing cycles. From a usability perspective, the researcher is interested in the following aspects, illustrated by the sketch after this list:

(i) Effectiveness: the relative effectiveness of children's different mnemonic actions in reaching an intended goal;

(ii) Efficiency: the time taken to complete tasks, relative to the allocation and usage of resources; and

(iii) Satisfaction: children's emotional reactions as measured by the Emotiv EPOC headset across emotions such as instantaneous excitement, long-term excitement, meditation, engagement, and frustration.
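
The sketch below shows how session data covering these three aspects might be summarized. The affect channel names follow the five emotions listed above, but the record format and the value ranges are illustrative assumptions, not the Emotiv SDK's actual output.

    from statistics import mean

    # One illustrative play session: per-task outcomes plus sampled affect values
    # (each affect sample is assumed normalized to the 0..1 range).
    session = {
        "tasks": [
            {"completed": True,  "seconds": 42, "errors": 1},
            {"completed": True,  "seconds": 35, "errors": 0},
            {"completed": False, "seconds": 90, "errors": 4},
        ],
        "affect": {
            "engagement": [0.7, 0.8, 0.6],
            "frustration": [0.2, 0.3, 0.7],
            "meditation": [0.4, 0.4, 0.3],
            "instantaneous_excitement": [0.5, 0.6, 0.2],
            "long_term_excitement": [0.5, 0.5, 0.4],
        },
    }

    def usability_summary(s: dict) -> dict:
        tasks = s["tasks"]
        return {
            # Effectiveness: share of goals reached, plus error counts.
            "completion_rate": sum(t["completed"] for t in tasks) / len(tasks),
            "total_errors": sum(t["errors"] for t in tasks),
            # Efficiency: time on task.
            "mean_seconds_per_task": mean(t["seconds"] for t in tasks),
            # Satisfaction: mean of each headset-reported emotion channel.
            "affect_means": {k: round(mean(v), 2) for k, v in s["affect"].items()},
        }

    print(usability_summary(session))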

This research is particularly important because of its expected outcomes: the findings will reveal children's emotional impact on the usability engineering of cognitive applications when a BCI headset is used. Moreover, the findings will cover children's emotional reactions, usability engineering methods, and brain-computer interaction technology; the results will also show what a suitable interactive design for children's memory games looks like, and the extent to which the designed game meets the usability requirements, as judged by expert review evaluation and heuristic inspection.

Emotions are of great importance in enhancing or improving any system interaction (Brave & Nass, 2003). Previously, systems were developed aesthetically but with no regard for the emotional influence they possessed (Papanek, 1985). In addition, system designers have reported that in the past, interactions with technology, computers in particular, were sterile and unemotional (Brave & Nass, 2003).

Nevertheless, design philosophers, scholars, neuroscientists, and psychologists have pointed out that emotion now plays an integral role in how people interact with technology, including computers and the interfaces developed to interact through this medium. According to McCarthy and Wright (2004), designers of interfaces and interactive systems need to recognize and centralize the emotional-volitional nature of any system.

In addition, it is important that designers understand that they do not design emotions; instead, they design for the optimum experience that results from personal interaction with the objects experienced in everyday life. It is worth mentioning that an extensive array of emotions plays influential roles in almost every goal-oriented activity (Brave & Nass, 2003).

Emotions are essentially built from plain reactions that readily promote the survival of an organism and hence could easily succeed in evolution (Damasio, 2001). Interestingly, Damasio (2001) described the ordering of feelings and emotions, pointing out that emotions withstood the test of evolution: human beings have emotions first and feelings second, given that evolution produced emotions first and feelings followed later (Damasio, 2001).

In general, human beings ultimately concern themselves with emotions that are made public. Emotions are actions that take place mainly in public, visible to other people in a person's voice, face, or certain behaviors. Conversely, a person's feelings are normally concealed, like all mental images necessarily are, hidden from everyone save their owner; they are the most private property of the organism in whose brain they occur (Damasio, 2001).

Emotion is one of the integral elements involved in education and learning (Parkinson, 1996). It affects a person's decision making, communication, and capacity to learn. Parkinson (1996) pointed out that emotions influence the decisions individuals make, how effectively they learn, and the way they communicate with other people. Psychologists define emotion as a disorganized, intuitive response caused by a lack of effective adjustment (Cannon, 1927; Schachter & Singer, 1962).

Valence is understood as the amount of negativity or positivity that an individual feels toward something. Conversely, arousal is considered as what gets the attention of an individual. In the field of computing, emotion is integral considering that it has the potential of influencing the effectiveness of learning (McCarthy & Wright, 2004). In this research study, the researcher will look into the kinds of situations which bring about emotions within a learning environment.

As such, in this project the researcher proposes to utilize an electroencephalography (EEG) device known as the Emotiv EPOC – as used in the Software and Knowledge Engineering Research Group (SKERG) at King Saud University – to sense the emotions of a user through brainwaves in a cognitive application game. This will allow the researcher to determine the positive or negative emotional impact of the game on children and to establish and understand the usability of these kinds of cognitive applications in childhood.

EEG is essentially an electrophysiological monitoring technique for recording the brain's electrical activity. The method is usually non-invasive, with electrodes placed along the scalp, although invasive electrodes are at times utilized in specific applications (Tatum, 2014). EEG measures fluctuations in voltage that result from ionic currents in the brain's neurons. The Emotiv EPOC – developed by Emotiv Systems – is a neuroheadset that lets players control game-play with their emotions, expressions, and even their thoughts. As Shende (2008) pointed out, the Emotiv EPOC is an innovative, high-fidelity brain-computer interface device for the video game market.
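
As a generic illustration of what measuring these voltage fluctuations means in practice, the sketch below computes the power of a synthetic EEG signal in the conventional frequency bands. It uses NumPy only and is not the Emotiv SDK; the sampling rate and band boundaries are standard textbook conventions, and the signal is synthetic.

    import numpy as np

    FS = 128  # sampling rate in Hz, assumed comparable to consumer EEG headsets

    # Synthetic one-channel "EEG": a 10 Hz (alpha) rhythm buried in noise.
    t = np.arange(0, 4, 1 / FS)
    signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

    # Conventional EEG frequency bands (Hz).
    BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

    def band_powers(x: np.ndarray, fs: int) -> dict:
        """Mean spectral power per band, via the signal's periodogram."""
        freqs = np.fft.rfftfreq(x.size, d=1 / fs)
        psd = np.abs(np.fft.rfft(x)) ** 2 / x.size
        return {
            name: float(psd[(freqs >= lo) & (freqs < hi)].mean())
            for name, (lo, hi) in BANDS.items()
        }

    print(band_powers(signal, FS))   # alpha power should dominate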

The neuroheadset itself is an easy-to-use, glossy, and lightweight wireless device featuring a number of sensors capable of detecting conscious thoughts, expressions, and non-conscious emotions based on electrical signals from the person's brain (Shende, 2008). The technology processes these signals, allowing players to control the actions or expressions of their in-game character and influence game-play using their emotions, expressions, and thoughts.

The Emotiv EPOC can non-invasively detect brain activity using EEG, a measure of brain waves, through external sensors along the individual's scalp. These detect the electrical activity in different areas of the furrowed surface of the brain's cortex, the section responsible for higher-order thought (Sergo, 2008).

The Emotiv EPOC can detect more than thirty distinct emotions, expressions, and actions, including emotional detections such as frustration, excitement, immersion, tension, and meditation; facial expressions such as anger/furrowed eyebrows, wink, shock/raised eyebrows, smirk, grimace/clenched teeth, horizontal eye movement, and smile; and cognitive actions such as rotate, drop, push, lift, and pull on six different axes (Shende, 2008). Owing to these detections, the player enjoys a more lifelike, immersive experience.


1.1.3    Software Usability Engineering

Emotions, as Parkinson (1996) pointed out, are one of the most important factors in creating highly developed educational systems that adapt to the needs of the user. Emotions are vital in many areas of learning, including creative thinking, motivation, concentration, and inspiration. A large portion of presently available educational systems do not consider the effects a user's emotions can have on learning. As such, this study will improve the usability of user interfaces by applying the three-cycle approach to measure the user's emotion in each experiment.

Usability Engineering (UE) is understood as the methodical approach used to improve the usability of user interfaces by applying various established techniques during the system development lifecycle (SDLC) (Nielsen, 1993). Nielsen (1993) identifies five qualities of usability: efficiency of use, ease of learning, error prevention, ease of remembering, and satisfaction.

As per the ISO 9241-11 standard, usability covers three components: satisfaction, efficiency, and effectiveness. Usability is defined as the degree to which a product can be used by specified users to attain specified goals with satisfaction, efficiency, and effectiveness within a specified context of use.

Satisfaction: this encompasses the positive emotions, attitudes, and comfort that arise from using a given service, product, or system. Attitudes comprise the degree to which users' expectations are met, and an individual user's satisfaction is part of his or her experience. This measure of usability is assessed with a brief questionnaire based on Lewis (1991).

Efficiency: according to ISO 9241, efficiency is the total resources expended on a given task; it is the relationship between the outcome attained and the resources used. This usability measure is assessed through task times. Efficiency metrics include the number of keystrokes or clicks required, or the total time on task. In general, the task needs to be defined from the user's perspective and not as a single, granular interaction (Nielsen, 1993).

Navigation design elements such as links, menus, keyboard shortcuts, and other buttons have an impact on efficiency. When the designer designs them well, with clearly expressed actions, users need less time and effort to make action and navigation choices. All in all, making the correct choices for efficient use of the software depends on an understanding of the users and of how they prefer to work.

Effectiveness: this is understood as the completeness, accuracy, and lack of negative outcomes with which the user achieves specified goals (Lewis, 1991). Effectiveness is established by examining whether the user's goals were attained successfully and whether all work is correct. This usability measure is assessed through the number of errors and through task completion.

ISO 9241-11 describes how to identify the information that needs to be considered when specifying or assessing usability in terms of measures of user performance and satisfaction. It provides explicit guidance on describing the product's context of use and the measures of usability.
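
Since satisfaction here is measured with a brief post-task questionnaire based on Lewis (1991), a minimal scoring sketch might look like this. The three ASQ-style items are paraphrased, and the 7-point scale orientation (lower is better) is an assumption of this illustration rather than a verbatim reproduction of the instrument.

    from statistics import mean

    # Three ASQ-style items (paraphrased), each rated 1 (strongly agree)
    # to 7 (strongly disagree), so lower scores indicate higher satisfaction.
    ASQ_ITEMS = [
        "I am satisfied with the ease of completing the tasks in this scenario.",
        "I am satisfied with the amount of time it took to complete the tasks.",
        "I am satisfied with the support information when completing the tasks.",
    ]

    def asq_score(ratings):
        """Overall score for one scenario: the mean of the item ratings."""
        assert len(ratings) == len(ASQ_ITEMS)
        assert all(1 <= r <= 7 for r in ratings)
        return mean(ratings)

    # One participant's post-task ratings for a single scenario.
    print(round(asq_score([2, 3, 2]), 2))   # 2.33 -> fairly satisfied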

Usability testing will be conducted to validate the research using the Emotiv EPOC headset tool. The levels of effectiveness, efficiency, and satisfaction will represent the usability of a cognitive application in the field; the memory game will be designed and implemented by the researcher. The results obtained will show the viability of the approach adopted for conducting usability testing of a computer game.

1.2 Problem Definition

Emotions are crucial in improving any system interaction, and researchers have reported that emotion plays a fundamental role in the way individuals interact with technology such as computers. Using children aged 4 to 6 years, this study seeks insight into the emotional impact on children of technologies designed for that age group.

The researcher will also investigate the impact of these emotions on the usability of the game through usability engineering with a BCI headset, because the target group – children aged 4 to 6 – cannot articulate what they really feel. The researcher proposes to employ an electroencephalography (EEG) device, the Emotiv EPOC, to detect a user's emotions through brainwaves in the cognitive application game and to demonstrate that people's emotional responses can actually vary.

The problem is to understand the impact of children's emotions in the cognitive application game for children aged 4 to 6 years. This is significant because such children cannot really explain their emotions; the research therefore helps determine the impact directly from the children's minds rather than through conversation, since at their young age these children cannot give good verbal explanations. Moreover, the researcher will determine how this cognitive application game can be used to improve learning in children, rather than merely deploying an innovative technology without any benefit to its users.

1.3 Research Scope

Target user

Target users are the individuals expected to use the device the researcher is proposing in the study. This study will focus on a sample of children aged 4 to 6 years from Saudi Arabia; the sample will therefore comprise Saudi children only. These children are the target users, and the researcher plans to use them to determine their emotions when using technologies designed for them.

Hardware / Software

In this study, usability testing will be conducted to validate the research using the Emotiv EPOC headset tool. The levels of effectiveness, efficiency, and satisfaction will represent the usability of a cognitive application in the memory game domain; the game will be designed and implemented by the researcher in this research project. The results obtained should show the viability of the approach adopted for carrying out usability testing of a computer game.

1.4 Aims and Objectives

The major aim of this research study is to gain insight into the emotional impact on children of technologies designed for children aged 4 to 6 years. In addition, the study seeks to evaluate the usability of a cognitive application, based on children's emotions, at three stages of the software development process.

  • Study 1 (Preliminary Study): Design a memory game for children's emotional-impact interaction (low-fidelity prototype phase).

  • Study 2: Usability evaluation of the low-fidelity prototype game by expert review and heuristic inspection (high-fidelity prototype phase).
  • Study 3: Usability evaluation of the high-fidelity prototype game by BCI headset and cognitive walkthrough inspection (high-fidelity prototype improvement phase).

1.5 Research Questions

The main question of this research study is: How effectively can children's emotional impact be captured in the usability engineering of cognitive applications using a BCI headset?

The sub-research questions are as follows:

  1. What are children's emotional reactions, what are the usability engineering methods, and what is brain-computer interaction technology?

How the researcher will understand and investigate these terms for this research is articulated in Chapter 2 – Literature Review.

  2. What is a suitable interactive design for children's memory games?

Empirical Study 1 will be carried out as reported in Chapter 3 – Study 1: Designing a memory game for children's emotional-impact interaction.

  3. To what extent does the designed game meet the usability requirements, according to expert review evaluation and heuristic inspection?

From a usability perspective, the researcher is interested in the following aspects: effectiveness, the relative efficacy of experts' different mnemonic actions in reaching an intended aim; efficiency, the time taken to carry out and finish tasks relative to the allocation and usage of resources; and satisfaction, measured through quantitative surveys of experts (Lewis, 1991). The researcher will apply usability engineering activities to evaluate the low-fidelity prototype game; Study 2 will be examined exhaustively and reported in Chapter 4 – Study 2: Expert Review and Heuristic Inspection.

  4. How can usability engineering be conducted with BCI technology evaluation?

From a usability perspective, the researcher is interested in the following aspects: effectiveness, the relative effectiveness of children's different mnemonic actions in reaching an intended goal; efficiency, the time taken to complete tasks relative to the allocation and usage of resources; and satisfaction, children's emotional reactions as measured by the Emotiv EPOC headset across emotions such as instantaneous excitement, long-term excitement, meditation, engagement, and frustration.

The researcher will apply usability engineering activities different from those of Study 2, together with BCI technology, to evaluate the high-fidelity prototype game; Study 3 will be examined fully and reported in Chapter 5 – Study 3: Cognitive Walkthrough.

1.6 Research Methodology

The aim and objectives will be achieved through three interrelated studies. This Master of Science thesis emphasizes concepts and processes related to usability engineering; the design and development of the cognitive game, undertaken by the researcher, goes beyond the usual scope of an MSc thesis. The three studies in the project lifecycle are:

  • Study 1: Designing a memory game for children's emotional-impact interaction

The researcher conducted a previous study on designing a cognitive game for children – a case study – together with semi-structured interviews with neurologists, psychiatrists, and education specialists, in order to gather information about current practice in memory games.

  • Study 2: Expert Review and Heuristic Inspection

The researcher conducted usability evaluation inspection methods on the low-fidelity prototype. The experiment focused on the usability of the interface/interaction design, engaging experts on the cognitive program side. Intensive validity testing sessions were conducted in every field, addressing the challenges faced by the different user groups in each iteration.

  • Study 3: Cognitive walkthrough

The researcher conducted usability engineering processes during the development iterations of the game. The Emotiv EPOC EEG headset and the Windows platform were selected for the development of the application to ensure usability for the different children's groups. The development focused on both the emotional impacts and the usability of the interface/interaction design to engage children in the cognitive program. Intensive validity testing sessions were conducted in all development iterations.

1.7 Outline of the Thesis (Document Structure)

Chapter 2 is a review of literature related to children's emotions, usability engineering methods and usability aspects, brain-computer interface (BCI) technology, and the Emotiv EPOC headset tool. Chapter 3 discusses the preliminary study covering the design of a memory game for children's emotional-impact interaction. Chapter 4 describes Study 2, the usability evaluation of the low-fidelity prototype game by expert review and heuristic inspection.

In Chapter 5, the researcher discusses Study 3 by presenting the usability evaluation of the high-fidelity prototype game by BCI headset and cognitive walkthrough inspection. Chapter 6 provides a discussion of the results from each of the three studies. The thesis concludes in Chapter 7 by presenting how all the objectives of the three studies were achieved during this work.

References

Brave, S., & Nass, C. (2003). Emotion in human-computer interaction. In J. Jacko & A. Sears (Eds.), The human-computer interaction handbook (1st ed., pp. 81-96). Hillsdale: Lawrence Erlbaum Associates.

Cannon, W. B. (1927). The James-Lange theory of emotion: A critical examination and an alternative theory. American Journal of Psychology, 39, 106-124. Retrieved from http://www.jstor.org/stable/1415404?seq=1#page_scan_tab_contents

Damasio, A. R. (2001). Fundamental feelings. Nature, 413, 781.

ISO 9241-11. Ergonomics of human-system interaction – Part 11: Usability: Definitions and concepts.

Ekman, P. (1992). An argument for basic emotions. Cognition and Emotion, 6, 169-200.

Lewis, J. R. (1991). Psychometric evaluation of an after-scenario questionnaire for computer usability studies: The ASQ. SIGCHI Bulletin, 23, 78-81. Retrieved from https://www.researchgate.net/publication/230786769_Psychometric_evaluation_of_an_after-scenario_questionnaire_for_computer_usability_studies_The_ASQ

McCarthy, J., & Wright, P. (2004). Technology as Experience. The MIT Press.

Nielsen, J. (1993). Usability engineering. New York: Oxford University Press. Retrieved from https://www.nngroup.com/books/usability-engineering/

Papanek, V. (1985). Design for the Real World: Human Ecology and Social Change. Academy Chicago Publishers.

Parkinson, B. (1996). Emotions are social. British Journal of Psychology, 87, 663-683. Retrieved from http://onlinelibrary.wiley.com/doi/10.1111/j.2044-8295.1996.tb02615.x/abstract;jsessionid=1B1141E227EB4D393BBBE4E306696882.f01t01

Schachter, S., & Singer, J. E. (1962). Cognitive, social, and physiological determinants of emotional state. Psychological Review, 69, 379-399. Retrieved from https://www.researchgate.net/publication/9090242_Cognitive_Social_and_Physiological_Determinants_of_Emotional_State_In_Psychological_Review_695_379-399

Sergo, P. (2008). Head games: Video controller taps into brain waves. Scientific American, 15(9): 2-11. Retrieved from http://www.scientificamerican.com/article/head-games-video-controller-brain/

Shende, S. (2008). Emotiv unveils world's first brain-controlled video gaming headset. Emotiv Systems. Retrieved from http://www.businesswire.com/news/home/20080220005408/en/Emotiv-Unveils-Worlds-Brain-Controlled-Video-Gaming-Headset

Tatum, W. (2014). Extraordinary EEG. Neurodiagnostic Journal, 54(1), 3-21. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/24783746


Cloud Computing in Companies


Many information technology experts would agree that cloud computing is the most convenient and dynamic model of service delivery since the birth of the internet. Corporate managers and decision makers have used cloud computing to achieve financial advantages and to protect themselves from security threats that could taint a reputable image and even threaten the longevity of the business itself (Pereira, 2014). In itself, cloud computing is a cost-effective and efficient alternative to purchasing and maintaining hardware and software in-house. This has been the case since the introduction of Amazon's S3 (Simple Storage Service) and EC2 (Elastic Compute Cloud).

Cloud computing has had various definitions, with different service providers and standards organizations defining it differently. For instance, the National Institute of Standards and Technology (NIST) defines cloud computing as "a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort." What cloud computing does is make platforms, software, and infrastructure available as web services over the internet. Consumers, however, do not know the specific physical location from which these services are provided; they therefore purchase software and hardware as services rather than as physical assets (Pereira, 2014).

Cloud computing falls into three main classifications: software, platform, and infrastructure as a service.

Software as a service moves the abstraction to the highest level in the stack. Most free cloud services, such as web-based email and word processing, fall into this category (Pereira, 2014). It benefits users because they incur no maintenance costs.

Platform as a service sits a step lower in abstraction and is an environment in which already developed and deployed applications are made available by cloud providers. Consumers take advantage of the platforms provided, such as the Java programming language and Oracle databases (Pereira, 2014).

Infrastructure as a service offers the lowest level of abstraction between what consumers want and the services readily available for them to use (Pereira, 2014). Consumers are supplied with raw computing, storage, and networking services, which help them build custom applications and services. The cloud provider is in charge of all physical resources, while the consumer controls everything else, including development tools and the operating system (Pereira, 2014).
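
The division of control that distinguishes these three models can be summarized in a small data structure. The layer breakdown below is a common textbook simplification used for illustration, not a formal NIST taxonomy.

    # Who manages each layer of the stack under each service model?
    LAYERS = ["application", "runtime/platform", "operating system",
              "virtualization", "physical hardware"]

    RESPONSIBILITY = {
        #          app         platform    OS          virt.       hardware
        "SaaS": ["provider", "provider", "provider", "provider", "provider"],
        "PaaS": ["consumer", "provider", "provider", "provider", "provider"],
        "IaaS": ["consumer", "consumer", "consumer", "provider", "provider"],
    }

    def who_manages(model: str, layer: str) -> str:
        return RESPONSIBILITY[model][LAYERS.index(layer)]

    print(who_manages("PaaS", "application"))        # consumer
    print(who_manages("IaaS", "operating system"))   # consumer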

Alongside its tremendous benefits, the technology also faces security threats, the most prevalent being cybercrime. Measures have, however, been put in place to try to alleviate these challenges (Pereira, 2014). The cloud itself can be used as a means of securing data and disseminating information, among other things.


Security Challenges

The general problem is that security breaches are evident in cloud computing (Jansen, 2011). The specific problem is the increasing cost of preventing and alleviating breaches in cloud security. Despite the rapid growth of computing technology, a great danger of losing data to cybercrime and other related threats remains. Efforts have been made to curb these security issues (Younis, 2013).

One of these measures is the use of the cloud itself as a tool for securing data and sharing information. At the same time, insecurity in cloud computing has recently surged (Han, 2013). This paper examines security vulnerabilities in cloud computing and discusses recent solutions proposed by different scholars.

The cloud's security risks may differ from those of traditional IT infrastructure in intensity, in nature, or in both. Several works have examined the cloud's security challenges from the service model perspective. The identified challenges are classified into the following domains: communication issues, and contractual and legal issues.

Communication Level Challenges

Normally, the services of the cloud are accessed by customers over the Internet (Younis, 2013). Subashini (2011) reports that standard Internet protocols and mechanisms facilitate communication between the cloud and customers. This communication conveys applications or information/data between the cloud and the customers, and the cloud's Virtual Machines (VMs) are also in constant communication with each other. Communication through the cloud is classified into external communication, which occurs between the cloud and the customers, and internal communication, which takes place within the cloud's infrastructure.

External communication has similarities with other forms of Internet communication. Consequently, the problems encountered by the cloud as a result of its Internet character are similar to those of conventional IT communication (Rong, 2013). These issues include man-in-the-middle attacks, denial of service, IP-spoofing-based flooding, eavesdropping, and deception (Jansen, 2011).

Contractual and Legal Level Challenges

According to Rong (2013), implementing cloud computing requires institutions to migrate applications and data into the administrative control of Cloud Service Providers (CSPs). As a result, several issues crop up, such as geographic jurisdiction, performance assurance, compliance with regulatory laws, and contract enforcement monitoring. These issues are associated with legalities, the physical location of data, and service level agreements (SLAs).

Cloud computing is an evolving term. Pereira (2014) defines it as including the application and system software in the datacenters that deliver these services. Some of its distinctive features are the illusion of unlimited hardware resources and the ability to pay only for the resources required. Similarly, Han (2013) states that the features of cloud computing include broad network access, a large resource pool, dynamic elasticity, on-demand self-service, and metered service similar to a utility.

In general, cloud computing uses three main service models (Mell, 2011). The first is software as a service (SaaS), a software distribution model in which applications are hosted by a vendor or service provider and accessed by customers or end users via a network, usually the internet. Some scholars, such as Sebesta (2012), associate this model with pay-as-you-go subscription licensing. The second model is platform as a service (PaaS), which allows customers to manage the hosting environment (Chen, 2012). The third model is infrastructure as a service (IaaS).

In this model, the customers control everything apart from the datacenter infrastructure (Sebesta, 2012).

Conclusion

Cloud computing is an emergent phenomenon that is set to revolutionize the use of the internet. Nevertheless, caution is needed: while this new technology has made human lives easier, people must understand the security risks and challenges involved in using it. The core focus of this research paper has been to highlight the challenges and security issues that arise from cloud computing. In summary, cloud computing has the capacity to become a pioneer in promoting safe, virtual, and economical solutions in the future.

References

Amazon. (2012, April). Amazon Web Services blog: Amazon S3 – 905 billion objects and 650,000 requests/second. Retrieved from http://aws.typepad.com/aws/2012/04/amazon-s3-905-billion-objects-and-650000-requestssecond.html

Chen, D., & Zhao, H. (2012). Data security and privacy protection issues in cloud computing. In Computer Science and Electronics Engineering (ICCSEE), 2012 International Conference on (Vol. 1, pp. 647-651). IEEE.

Chen, Y., Paxson, V., & Katz, R. H. (2010). What's new about cloud computing security? University of California, Berkeley, Report No. UCB/EECS-2010-5.

Dikaiakos, M. D., Katsaros, D., Mehra, P., Pallis, G., & Vakali, A. (2009). Cloud computing: Distributed internet computing for IT and scientific research. Internet Computing, IEEE, 13(5), 10-13.

Han, Y. (2013). On the clouds: A new way of computing. Information Technology and Libraries, 29(2), 87-92.

Jansen, W. (2011, January). Cloud hooks: Security and privacy issues in cloud computing. In System Sciences (HICSS), 2011 44th Hawaii International Conference on (pp. 1-10). IEEE.

Kuo, A. M. H. (2011). Opportunities and challenges of cloud computing to improve health care services. Journal of Medical Internet Research, 13(3).

Mell, P., & Grance, T. (2011). The NIST definition of cloud computing.

Pereira, A., Machado, R. J., Fernandes, J. E., Teixeira, J., Santos, N., & Lima, A. (2014). Using the NIST reference model for refining logical architectures. In Computational Science and Its Applications – ICCSA 2014 (pp. 185-199). Springer International Publishing.

Ren, K., Wang, C., & Wang, Q. (2012). Security challenges for the public cloud. IEEE Internet Computing, (1), 69-73.

Rong, C., Nguyen, S. T., & Jaatun, M. G. (2013). Beyond lightning: A survey on security challenges in cloud computing. Computers & Electrical Engineering, 39(1), 47-54.

Ryan, M. D. (2013). Cloud computing security: The scientific challenge, and a survey of solutions. Journal of Systems and Software, 86(9), 2263-2268.

Sebesta, D. J., Schmidt, S., Westerinen, W. J., & Carpenter, T. (2012). U.S. Patent No. 8,161,532. Washington, DC: U.S. Patent and Trademark Office.

SO, K. (2011). Cloud computing security issues and challenges. International Journal of Computer Networks, 3(5).

Younis, M. Y. A., & Kifayat, K. (2013). Secure cloud computing for critical infrastructure: A survey. Liverpool John Moores University, United Kingdom, Tech. Rep.


Technological Advancements


In recent times, technological advancements have had a massive impact on the healthcare sector in the USA. The healthcare system in the USA faces various challenges despite the major improvements implemented in the sector. Shortfalls in efficiency and quality of care have given rise to different perspectives on the healthcare system in the USA.

One historical perspective is that disparities in health care persist, as patients are not fully satisfied with the care received despite the huge expenditure in the evolving health sector (Piscotty et al., 2015). There have been discussions regarding insurance coverage, poor quality, and escalating costs. Another historical perspective is that the US healthcare system has some of the best professionals in the world, but is overspecialized, inequitable, and neglectful of preventive and primary care.

These different perspectives all touch on the healthcare system, and technological advances are reshaping perspectives that have persisted for a long time. Information technology has shaped the first perspective positively by enabling better ways of ensuring efficiency in the delivery of care and addressing disparities (Piscotty et al., 2015). New medicines, new machines and treatment methods, and the integration of electronic health records into the system have improved the delivery of care, shaping the perspective for the better.

However, poor development of technology leads to lower satisfaction rates, as people expect improvements to be stimulated by innovations that require technological advancement. Advancement in technology helps ensure that the USA has a better healthcare system than other countries. Better integration of these advancements would improve the perspective by eliminating the issues of neglect and inequity, since care would be patient-centered (Piscotty et al., 2015).

Low technological development affects the perspective by letting it evolve toward regarding the health system as inefficient. Equitable primary care, and a focus on it while adopting technological advancements, is essential for improving healthcare systems, whereas low development results in negative impacts.

Various technological advancements have positively impacted the US healthcare system by revolutionizing it for the better. The electronic health record (EHR) is one of the major advancements: through it, the systems used in healthcare centers have been successfully integrated into a single platform.

Previously there were disparate systems, but the EHR has enabled a more structured platform, supporting more efficient and integrated care for patients (Piscotty et al., 2015). The advancement has improved healthcare delivery by ensuring consistency in the medicine and treatment provided, through efficiencies and the centralization of patient information, thus leading to desired outcomes. Another technological advancement is new surgical procedures, e.g., the anesthesia used in the healthcare system.

It contrasts with the electronic health record, which is a support-system advancement, whereas anesthesia concerns treatment and surgical procedures. This advancement involves anesthetic practices and agents that reduce the patient's burden of surgery, and it has led to positive impacts such as fewer medical errors and better surgical outcomes for patients (Piscotty et al., 2015).

These technological advancements have influenced public opinion in the USA in light of the changes they have brought to the healthcare system. Surgical improvements and the use of anesthetics have produced positive patient opinion of the advancement. Patients have rated the improvement highly, since it has reduced medical errors and stimulated quicker recovery, regarded as a major improvement in healthcare delivery (Piscotty et al., 2015).

The public also gives positive opinions because patients experience less suffering thanks to the effectiveness of procedures performed by healthcare professionals. One negative opinion is that, despite the improvements, the greater revolution in the surgical field has made patients incur higher costs for the better services received. Patients have also given positive opinions regarding the incorporation of electronic health records into the healthcare system.

The public's opinion is that the advancement brings consistency to the way health systems handle health information, resulting in better outcomes. The public regards the advancement as more satisfactory, since the system makes better coordination of treatment and provision of medicine possible. However, the failure of some health organizations to implement the advancement, or to have a structured platform in place, makes the public regard the change as time-consuming and complicated, slowing the delivery of care in the health sector (Piscotty et al., 2015).

Technological advancements are crucial to healthcare utilization and delivery. The electronic health record would be essential in centralizing information and might also be used in the future as a population health and data tool. Such changes would enable more efficient and consistent care delivery strategies, eliminating the many challenges caused by failure to integrate patient care effectively (Piscotty et al., 2015).

Advancements in anesthetics, as part of the surgical improvements, would impact care delivery through fewer medical errors, reduced hospital stays, and fewer readmissions, all improvements and positive impacts on healthcare delivery. Advancements such as the use of anesthetic practices and agents lead to greater utilization compared with older techniques, and the substantial innovation in this area over the years also leads to higher utilization in US healthcare systems.

The use of electronic health records would affect healthcare utilization by eliminating unnecessary medical procedures and duplicate tests, thus lowering utilization in the healthcare system. Beyond technological advancements, other factors that might affect healthcare utilization are insurance, education, the need for medical services, and social networking issues (Piscotty et al., 2015). In a nutshell, the effective adoption and integration of technological advancements in healthcare has led to more positive changes in the health system.

Reference

Piscotty, R. J., Kalisch, B., & Gracey-Thomas, A. (2015). Impact of healthcare information technology on nursing practice. Journal of Nursing Scholarship, 47(4), 287-293. doi:10.1111/jnu.12138


Defending the spirit of the web


Defending the spirit of the web: Conflicts in the internet standards process

Introduction

The creation and adoption of standards frequently resembles a game; in other words, the standards process mirrors the activities of modeled actors in networks. This paper focuses on the development of web services choreography standards. It details their history, explains the technological arguments surrounding the standards, develops a model of standardization, and looks at a previous case from three perspectives. The first perspective is social: it follows the people involved in the standards process, an approximation of more complex approaches. The second is cultural: it looks at the ideas of the participants, particularly their ideas about technology (Edrei, 2016). The third is economic.

These three approaches are intertwined. Economic theories are useful for explaining how individuals with a given set of utilities will interact; they are weaker at determining where utility comes from, whereas social theories are better for understanding how an individual's utility is formed. Social research thus allows a better decision model to be built and calibrated.

Literature Review

  1. The Social Perspective: Following the People

The standards process for constructing web service composition consists of two decision processes: development and adoption. In the development process, the participants in a standards body create and debate the standard. When the work goes well, a standards document is created and repeatedly refined until it is introduced to a wider community for acceptance (Nickerson & Zur Muehlen, 2013).

Web services choreography describes the coordination of long-running transactions between business partners using common internet protocols. It can be used in a diversity of domains, extending from supply chain management to media content solicitation. The foundations of web services choreography lie in workflow management technology, which has been available since the mid-1980s.

In the early 1990s, large workflow users became aware of the possibility that they would have to cope with several workflow solutions from different vendors. Standards were first created within the scope of the Workflow Management Coalition, but are increasingly being defined by competing standards groups, such as the Business Process Management Initiative (BPMI), the Organization for the Advancement of Structured Information Standards (OASIS), and the World Wide Web Consortium (W3C).

In parallel to this growth, the use of internet technology for application integration became feasible through the introduction of value-added protocols on top of the ubiquitous Hypertext Transfer Protocol (HTTP) used for the World Wide Web. These standards allow application designers to open their applications for access over the internet. However, some applications support only simple request-response message exchanges; more robust mechanisms are needed for coordinating long-running transactions, such as a sequenced exchange of quotes, orders, and delivery notes.
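
To make the contrast concrete, the sketch below models the quote-order-delivery exchange as a tiny state machine: unlike a single HTTP request-response, the conversation's legal message order is tracked across multiple exchanges. The states and message names are illustrative, not taken from any particular choreography standard.

    # A minimal choreography: which message may follow which, per conversation.
    NEXT = {
        "start":        {"Quote"},
        "Quote":        {"Order"},
        "Order":        {"DeliveryNote"},
        "DeliveryNote": set(),   # conversation complete
    }

    class Conversation:
        """Tracks one long-running exchange between two business partners."""
        def __init__(self):
            self.state = "start"

        def receive(self, message: str):
            if message not in NEXT[self.state]:
                raise ValueError(f"{message!r} not allowed after {self.state!r}")
            self.state = message

    c = Conversation()
    for msg in ["Quote", "Order", "DeliveryNote"]:
        c.receive(msg)            # each step may be a separate HTTP exchange
    print("complete:", not NEXT[c.state])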

Web service composition standards are not moving by themselves; it is the participants in the choreography process who pack up, leave, and reassemble within a different standards organization. A finite number of people are active in the composition process, and as a group loses momentum, some of its members move to a livelier group.

Participants in the standards process look for a venue in which they can produce a standard that is technically excellent. When they are uncomfortable in a group they feel is foundering, they jump to another group. Given this pattern of people moving from one group to another, it is essential to look at the ideas as well.

  2. The Cultural Perspective: Following the Ideas

Judging from the movements of standards participants, the main reason for shifting groups is disenchantment with the progress of work in the group, which can be described as a growing sense that the evolving standard is becoming too complicated. Quitting a standards process because of its complexity does not appear to fit a game-theoretic model in which players seek to dominate one another in the interests of their sponsoring corporations. Instead, participants appear to be engaged in something that looks like an aesthetic evaluation of the standard.

An aesthetic judgment can embody a strong value system. For instance, graphic designers of the modern school are taught to avoid anything that does not contribute to the meaning of a page, and programmers are taught to distinguish clean from kludgy code. One standards document explicitly disapproves of a potential use of the Simple Object Access Protocol (SOAP) because it is counter to the spirit of the web.

Such a use violates the design aesthetic of those who built the web protocols. The phrase “the spirit of the web” is interesting from two perspectives. First, it is closer to an aesthetic than a rule, because there are myriad different ways to define a protocol. Second, it suggests that the web has a spirit, and that this spirit is to be defended.

Most internet standards have grown through the Request for Comments (RFC) process, whose goals are described as follows: technical excellence; prior implementation and testing; clear, concise, and easily understood documentation; openness and fairness; and timeliness.

Participants in the development of web services standards are influenced both by the norms of the corporations they serve and by the beliefs of the technical community of which they consider themselves part. For example, open source developers sometimes write source code while on the job, occasionally without their employer knowing (Fielding, 2013), and define their identity within the hacker community. It may well be that regular standards participants are similar.

  3. The Economic Perspective: Following the Money

It is clear that a great deal about standards can be learned by evaluating the potential benefits to the players involved in decision making. Vendors want to make money from standards, and by making a standard more complex they increase their chances of selling products (Swenson, 2013). A programmer's utility can be evaluated by looking at criteria other than money. The development stage can be described as a stage of collective invention, and as part of this invention new ideas are continually evaluated.

The dialogues captured in standards groups' discussions often concern the weighing of different attributes, and it can be predicted that different groups will weight a similar set of attributes differently. In standards groups labor is voluntary and switching costs are low, so quitting one standards body and re-forming in another is a viable option.

In modeling the standards creation process, one possible result of an impasse is migration to a different standards committee. Methodologically, the social perspective helps inform the economic perspective: the bylaws of the groups may determine the allowable jumps, so that movement between groups may be less random than it appears.

Vendors will want to serve on multiple standards committees so that they are not left behind if one takes off. Users, for their part, may have little motivation to adopt any one of these standards as long as the herd does not move.

Under conditions of high uncertainty, waiting may be the best strategy. In web services choreography one can indeed observe an absence of user adoption alongside vendor participation on multiple committees. From observation of different standards and their participants, standards can be categorized as being driven by three different groups. The first two groups are self-evident.

Some standards are clearly driven by vendors, and some are clearly driven by users; for instance, RosettaNet is driven by a set of companies in the manufacturing industry. But there is a third set of specifications, such as HTTP, in which representatives of corporations are involved but the standards are not driven by those corporations. These can be called research-driven. Often, those engaged in their creation are financed by government research funding organizations (Kazman et al., 2014).

The corporate representatives involved in these standards are often individuals who maintain strong links with the research community, and the standards themselves are sometimes created in standards groups strongly identified with that community. To understand the economics of standards development, it is essential to look at the funding sources and, furthermore, at the sympathies of those on the committees.

Most software vendors are pushing for SOAP-based coordination standards. By contrast, the dispute between competing standards for railroad track widths was resolved by the wishes of an important customer, the federal government; in rare cases, customers can overcome the wishes of vendors. Vendors often use the standardization process toward their own ends, and, in game-theoretic terms, the harshest accusation against vendors, that they conspire to sabotage standards, might sometimes be correct. There is a constant tension between the proposals of research-oriented participants and those who conscientiously represent the interests of their sponsoring firms.

The standards process is complex, and multiple perspectives, applying both social and economic techniques, are more likely to yield insights than any single technique. Future research might suggest ways of preserving or improving the overall functioning of standards bodies.

References

Edrei, A. (2016). Divine spirit and physical power: Rabbi Shlomo Goren and the military ethic of the Israel Defense Forces. Theoretical Inquiries in Law, 7(1), 255-297.

Fielding, R. T. (2013). Architectural styles and the design of network-based software architectures (Doctoral dissertation). Department of Computer Science, University of California, Irvine, CA.


Kazman, R., Asundi, J., & Klein, M. (2014). Quantifying the costs and benefits of architectural decisions. ICSE 2001.

Nickerson, J. V., & Zur Muehlen, M. (2013, December). Defending the spirit of the web: Conflicts in the internet standards process. In Proceedings of the Workshop on Standard Making (pp. 56-69). Retrieved from http://web.stevens.edu/jnickerson/SpiritOfTheWeb.pdf

Swenson, K. D. (2013). Personal communication. M. zur Muehlen. Hoboken, NJ.

West, J. (2013). How open is open enough? Melding proprietary and open source platform strategies. Research Policy, 32, 1259-1285.


Emerging Cyber Security Approaches


Introduction

People have become more dependent on information systems over the past few years. Even as societies increasingly embrace open networks, the demand for new cyber security measures grows each day. Emerging trends such as the Internet of Things (IoT) have further prompted governments, businesses and institutions to seek advanced ways of tackling cyber threats.

One of the reasons these threats are difficult to handle is the existence of laws that require facts to be investigated across international borders. This, together with the growing complexity of networking infrastructure, has created demand for new approaches to cyber threats. Several cyber security strategies are already in use by organizations, and while their monitoring systems can help secure information, they are hardly enough to counter today's external and persistent threats efficiently.

Current methods include facial recognition and fingerprint biometrics, which have seen a wide range of applications and usage. These approaches, however, need to be updated and made considerably more sophisticated to address emerging threats. Meanwhile, new approaches are being researched and tested to address cyber security issues, such as the use of random numbers, remote agent technologies, real-time forensic analysis, and smart card technologies.

Although these technologies have not been widely adopted and have so far received little practical application, they can offer strategic solutions to the problem of cyber security when well researched.

Use of Random Numbers

Scientists at the University of Texas at Austin have produced a landmark piece of work in cryptography, in which new methods for producing random numbers could be used to encrypt data (Bowie, 2016). At the core of the random number technology are David Zuckerman, a UT computer science professor, and Eshan Chattopadhyay, a computer science graduate student. The two developed a theorem that lays the foundation for new methods of producing random numbers that require little computational effort (Ransom, 2016).

The idea behind the technology is to create a randomness application that produces sequences which do not follow any particular pattern and thus cannot be predicted. Scientists use randomness to design algorithms and model complex systems. Some of the most common uses of randomness are measuring demographic changes in the economy and conducting electoral polls.
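
As a toy illustration of that last point (this example is mine, not taken from the UT Austin work), the short Python sketch below shows why polling depends on pattern-free randomness: a uniformly random sample of a population estimates the true proportion well, whereas a patterned selection need not.

```python
# Toy illustration: electoral polling relies on random, pattern-free
# sampling. A uniform random sample of 1,000 voters estimates the true
# 52% support closely in expectation.
import random

population = [1] * 52_000 + [0] * 48_000   # 52% of voters support a candidate
sample = random.sample(population, 1_000)  # unpredictable, uniform sample
print(sum(sample) / len(sample))           # typically close to 0.52
```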

In their draft paper, titled “Explicit Two-Source Extractors and Resilient Functions,” the scientists state that their efforts are “motivated by the ubiquitous use of randomness in various branches of computer science like algorithms, cryptography, and more” (Zuckerman & Chattopadhyay, 2016, p. 2).

According to Ransom (2016), extractors are functions that output nearly uniformly distributed random numbers regardless of the structure of the input sources. Most applications require truly random, uncorrelated bits, but most easily obtainable sources of randomness do not satisfy these conditions. The study of randomness extraction therefore focuses on obtaining nearly uniform numbers from sources that are only weakly random.
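
For a concrete sense of what an extractor does, the sketch below implements the classic inner-product two-source extractor (in the style of Chor and Goldreich), which outputs one nearly uniform bit from two independent weak n-bit sources, provided each source has min-entropy above n/2. To be clear, this is not the Chattopadhyay-Zuckerman construction, whose advance is tolerating far lower min-entropy; it only illustrates the input-output contract of a two-source extractor.

```python
# Classic inner-product two-source extractor: Ext(x, y) = <x, y> mod 2.
# Two independent weak n-bit sources in, one nearly uniform bit out
# (requires min-entropy > n/2 per source). NOT the Chattopadhyay-Zuckerman
# construction described in the text.

def inner_product_extract(x: int, y: int, n: int) -> int:
    """Return <x, y> mod 2 over the n low-order bits of x and y."""
    mask = (1 << n) - 1
    overlap = (x & y) & mask            # positions where both bits are 1
    return bin(overlap).count("1") % 2  # parity = inner product mod 2

# Two hypothetical 8-bit samples from independent weak sources:
print(inner_product_extract(0b10110100, 0b01110011, 8))  # prints 0 or 1
```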

Government Role

The federal government faces numerous cyber security threats that have the potential to jeopardize infrastructure, the economy, and citizens' freedoms. Failure to actively pursue new and sophisticated countermeasures can have multidimensional effects. Furthermore, states such as Russia, China, and Iran have shown a willingness to seize US digital data for their own gain.

For these reasons, the federal government is at the forefront of supporting emerging cyber security technologies such as the explicit two-source extractors and resilient functions based on random numbers. The government has supported this research indirectly through federal aid, the provision of sufficient resources and infrastructure, and the maintenance of an environment in which scientific experiments can be carried out freely.

Additionally, the government has developed a viable cyber security liability and insurance system, coupled with advocacy for greater private-sector efforts to enhance general awareness and ratings.

Real Examples

According to Zuckerman (2016), high-quality randomness can help keep online channels more secure. People need randomness when making online transactions to protect their information from hackers; buying something from an online site like Amazon, for example, relies on randomness. While current randomness technologies are already in play, the emergent explicit two-source extractors and resilient functions could offer more secure and less predictable random patterns.
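
As a minimal sketch of the kind of unpredictable values such a transaction depends on, the snippet below uses Python's standard `secrets` module, an ordinary cryptographically secure generator rather than the extractor discussed above; the variable names are purely illustrative.

```python
# Minimal sketch of the unpredictable values an online purchase relies on.
# Uses Python's `secrets` module (a CSPRNG), not the extractor described
# in the text; names are illustrative.
import secrets

session_token = secrets.token_urlsafe(32)  # e.g., an unguessable session ID
nonce = secrets.token_bytes(16)            # e.g., a fresh nonce for encryption
print(session_token)
print(nonce.hex())
```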

This cryptographic advance can be used to encrypt data and make electronic payments more confidential, while also enabling more accurate, statistically significant polls. The emerging technology creates truly random numbers with fewer computations than existing methods, which will go a long way toward securing consumer credit cards and military communications…

