Monday, April 13, 2015

Hybrid cloud computing technology

Hybrid cloud is an infrastructure that combines a cloud managed by the user (usually called a "private cloud") with at least one cloud managed by a third party (usually called a "public cloud"). Although the public and private portions of a hybrid cloud are linked together, they remain distinct entities. This allows a hybrid cloud to offer the benefits of several deployment models at once. Hybrid clouds vary greatly in sophistication: some offer only a connection between the on-premises site and a public cloud, leaving all the difficulties of operating two different infrastructures to the operations and application teams.

Most hybrid cloud commentary focuses on infrastructure, but several capabilities matter just as much for an enterprise:
  • Application abstraction: wherever possible, applications should not depend on the underlying infrastructure. Linux and Windows containers are the primary means of abstracting applications from infrastructure; a platform layer can also provision resources on an application's behalf, reducing or eliminating direct calls to infrastructure APIs. When a platform has no proprietary APIs and works with third-party container standards such as Docker and Rocket (rkt), applications also gain the portability benefits of a PaaS.
  • Policy-based deployment: organizations must be able to place workloads according to programmable policy. Some workloads will be sent to the public cloud based on risk and performance considerations, while others will be deployed internally, and an application may move back and forth depending on the desired outcome. For example, an organization may choose to use public cloud services for archiving its data while keeping customer-facing services in-house.
  • Self-service developer access: developers should have on-demand access to multiple cloud services, such as data storage, from selected public cloud providers and internal operations alike, exposed through a service catalog for self-service provisioning.
  • Support for existing and new applications/microservices: the environment must not be exclusive to new "greenfield" applications. Existing applications can and should take advantage of the scalability, elasticity, and other distinctive characteristics of the cloud, while new applications should be easier to build using advanced cloud features.
These and other platform features help create the "smart endpoints and dumb pipes" architecture that is a central goal of microservices.
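To make the policy-based deployment idea above concrete, here is a minimal sketch of a placement policy in Python. The attribute names, policy rules, and thresholds are illustrative assumptions of ours, not part of any real scheduler's API.

```python
# Hypothetical sketch of policy-based workload placement: route each
# workload to the private or public cloud according to simple rules.

def place_workload(workload: dict) -> str:
    """Return 'private' or 'public' based on programmable policy rules."""
    # Rule 1: sensitive data (e.g. customer records) stays in-house.
    if workload.get("data_sensitivity") == "high":
        return "private"
    # Rule 2: archival and bursty batch work goes to the public cloud,
    # where capacity is cheap and elastic.
    if workload.get("kind") in ("archive", "batch"):
        return "public"
    # Default: keep unknown workloads private, the conservative choice.
    return "private"

print(place_workload({"kind": "archive", "data_sensitivity": "low"}))  # public
print(place_workload({"kind": "web", "data_sensitivity": "high"}))     # private
```

In a real hybrid cloud platform such rules would be expressed in the platform's own policy language and evaluated by its scheduler; the point is only that placement is driven by declared policy rather than manual choice.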
The idea behind the hybrid cloud is that companies can take advantage of the cost effectiveness and scalability of the public cloud without exposing mission-critical applications and data to the vulnerabilities associated with the public cloud option. In addition, the hybrid model often yields the most effective overall solution, because each type of data can be placed on the platform that is most efficient and secure for it.
A hybrid cloud is usually created in one of two ways: either a vendor with a private cloud offering partners with a public cloud provider, or a public cloud provider partners with a vendor that offers private cloud platforms.
Many organizations can reap great benefits from the hybrid option. For example, a company may want to use a SaaS application but is concerned about the security risks of a third-party provider, so it works with a vendor such as Apprenda to run the application in a private cloud inside its firewall. Or a company offering services tailored to different vertical markets may use the public cloud to interact with its customers while keeping their data in the secure environment of a private cloud.

Wednesday, April 8, 2015

New mobile cloud storage technology


Mobile cloud storage is a form of cloud storage that can be accessed from mobile devices such as laptops, tablets, and smartphones. Mobile cloud storage providers offer services that allow users to create and organize files, folders, music, and photos, similar to other cloud computing services. The services are used by individuals and businesses alike. Most mobile cloud storage providers offer a limited amount of free storage and charge for additional storage once the free limit is exceeded.
These charges are usually billed as a monthly subscription, with different prices depending on the amount of storage required.

In 2011, revenue from cloud services was approximately $58.6 billion, a small share of the roughly $2.6 trillion worldwide IT market. Gartner predicted this revenue would rise to $152 billion by 2014.

Some mobile device manufacturers preload mobile cloud storage applications on their products. These applications make it easy for users to synchronize files across multiple devices. Setting up a new mobile device often includes configuring a cloud storage service to back up the device's files and settings. Apple iOS devices come preloaded and configured to use Apple's iCloud mobile cloud storage service. Google offers similar functionality in the Android operating system through built-in support for Google Drive accounts. Samsung has partnered with Dropbox for its Galaxy smartphones, while Microsoft promotes its SkyDrive service.

Most companies that offer cloud storage provide secure websites through which users can access their files from any device that can browse the Internet.

Friday, March 27, 2015

Securing the data center: don't lose money or information

A data center is a facility used to house computer systems and associated components, such as telecommunications and storage systems. It generally includes redundant or backup power supplies, redundant data communications connections, environmental controls (e.g., air conditioning and fire suppression), and various security devices. Large data centers are industrial-scale operations that can use as much energy as a small town.

Requirements for modern data centers

IT operations are a crucial aspect of most organizations' activities worldwide, and one of the main concerns is business continuity. Companies rely on their information systems to run their operations; if a system becomes unavailable, company operations may be impaired or stopped completely. It is necessary to provide a reliable infrastructure for IT operations in order to minimize any chance of disruption. Information security is also a concern,
which is why a data center must offer a secure environment that minimizes the chances of a security breach. A data center must therefore maintain high standards for assuring the integrity and functionality of its hosted computer environment. This is accomplished through redundancy of mechanical cooling and power systems (including emergency backup power generators) serving the data center, along with redundant fiber optic cabling.
The Telecommunications Industry Association's TIA-942 standard for data centers defines the minimum requirements for telecommunications infrastructure in data centers, including single-tenant enterprise data centers and multi-tenant Internet hosting data centers. The topology proposed in the standard is intended to be applicable to data centers of any size.
Telcordia GR-3160, NEBS Requirements for Telecommunications Data Center Equipment and Spaces, provides guidelines for data center spaces within telecommunications networks and environmental requirements for the equipment intended for installation in those spaces. These criteria were developed jointly by Telcordia and industry representatives. They may be applied to data center spaces housing data processing or information technology (IT) equipment. The equipment may be used to: operate and manage a carrier's telecommunications network; provide data-center-based applications directly to the carrier's customers; provide hosted applications for a third party to serve its customers; or provide a combination of these and similar data center applications.
Effective data center operation requires a balanced investment in both the facility and the housed equipment. The first step is to establish a baseline facility environment suitable for equipment installation. Standardization and modularity can yield savings and efficiencies in the design and construction of telecommunications data centers.
Standardization means integrated building and equipment engineering. Modularity has the benefits of scalability and easier growth, even when planning forecasts are less than optimal. For these reasons, data centers should be planned in repetitive building blocks of telecommunications equipment and associated power and support (conditioning) equipment where practical. The use of dedicated centralized systems requires more accurate forecasts of future needs to prevent costly over-construction, or perhaps worse, under-construction that fails to meet future needs.
A "lights-out" data center, also known as a darkened or dark data center, is a data center that, ideally, has all but eliminated the need for direct access by personnel, except under extraordinary circumstances. Because staff do not need to enter the data center, it can be operated without lighting. All of the devices are accessed and managed by remote systems, with automation programs used to perform unattended operations. In addition to the energy savings, reduced staffing costs, and the ability to locate the site farther from population centers, implementing a lights-out data center reduces the threat of malicious attacks upon the infrastructure.
There is a trend to modernize data centers to take advantage of the performance and energy efficiency of newer IT equipment and capabilities, such as cloud computing. This process is known as data center transformation.
Organizations are experiencing rapid IT growth, but their data centers are aging. Industry research company International Data Corporation (IDC) puts the average age of a data center at nine years. Gartner, another research company, says data centers older than seven years are obsolete.
In May 2011, data center research organization Uptime Institute reported that 36 percent of the large companies it surveyed expected to exhaust IT capacity within the next 18 months.
Data center transformation takes a step-by-step approach through integrated projects carried out over time. This differs from the traditional method of data center upgrades, which takes a serial and siloed approach. The typical projects within a data center transformation initiative include standardization/consolidation, virtualization, automation, and security.
  • Standardization/consolidation: The purpose of this project is to reduce the number of data centers a large organization may have. It also helps reduce the number of hardware and software platforms, tools, and processes within a data center. Organizations replace aging data center equipment with newer equipment that provides increased capacity and performance, and they standardize computing, networking, and management platforms so these are easier to manage.
  • Virtualization: There is a trend to use IT virtualization technologies to replace or consolidate multiple data center devices, such as servers. Virtualization helps to lower capital and operational expenses and to reduce energy consumption. Virtualization technologies are also used to create virtual desktops, which can then be hosted in data centers and rented out on a subscription basis. Data released by investment bank Lazard Capital Markets reported that 48 percent of enterprise operations would be virtualized by 2012. Gartner views virtualization as a catalyst for modernization.
  • Automation: Data center automation involves automating tasks such as provisioning, configuration, patching, release management, and compliance. As enterprises suffer from a shortage of skilled IT workers, automating tasks makes data centers run more efficiently.
  • Security: In a modern data center, data security requires integrating virtual security systems with the existing physical security infrastructure. The security of a modern data center must take into account physical security, network security, and data and user integrity.

Wednesday, March 25, 2015

Cloud gaming technology

Cloud gaming, sometimes called gaming on demand, is a type of online gaming. There are currently two main types of cloud gaming: cloud gaming based on video streaming and cloud gaming based on file streaming. Cloud gaming aims to provide end users with frictionless, instant ability to play games across various devices.

Types of cloud gaming


Cloud gaming is a term used to describe a form of online game distribution. The most common methods of cloud gaming currently are video streaming (or "pixel streaming") and file streaming.
Video streaming

Cloud gaming based on video streaming, also called "gaming on demand", is a type of online gaming that allows direct and on-demand streaming of games onto computers and mobile devices, similar to video on demand, through the use of a thin client. The actual game is stored, executed, and rendered on the game company's or a remote operator's server, and the video results are streamed directly to the consumer's computer over the Internet. This allows access to games without the need for a console and makes the capability of the user's computer largely irrelevant, since the server is the system performing the processing. The controls and button presses from the user are transmitted directly to the server, where they are recorded, and the server then sends back the game's response to the input.

Companies that use this type of cloud gaming include PlayGiga, CiiNOW, Ubitus, Playcast Media Systems, Gaikai and OnLive.

A gaming-on-demand service takes advantage of a broadband connection, large server clusters, encryption, and compression to stream game content to a subscriber. Users can play games without downloading or installing the actual game. Game content is not stored on the user's hard drive, and game code execution occurs primarily at the server cluster, so the subscriber can use a less powerful computer to play the game than the game would normally require, since the server performs all the heavy processing usually done by the end user's computer. Most cloud gaming platforms are proprietary and closed; the first open-source cloud gaming platform was not released until April 2013.
File streaming

Cloud gaming based on file streaming, also known as progressive downloading, deploys a thin client in which the actual game runs on the user's gaming device, such as a mobile device, computer, or console. A small part of the game, usually less than 5% of the total game size, is downloaded initially so that the player can start playing quickly. The remaining game content is downloaded to the end user's device while playing. This allows instant access to games, even over low-bandwidth Internet connections, without lag. The cloud is used both as a means of streaming the game content and for analyzing the large amounts of data involved.

Cloud gaming based on file streaming requires a device that has the hardware capability to run the game. Often, the downloaded game content is stored on the end user's device, where it is cached temporarily.
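The progressive-download behavior described above can be sketched in a few lines: play begins once an initial chunk (assumed here to be about 5% of the total size, per the figure quoted earlier) has arrived, while the rest is fetched in the background. The threshold, sizes, and function names are assumptions for illustration only, not any vendor's API.

```python
# Illustrative check used by a hypothetical file-streaming launcher:
# is enough of the game present to start playing?

INITIAL_FRACTION = 0.05  # assumed "less than 5%" startup threshold

def can_start(total_bytes: int, downloaded_bytes: int) -> bool:
    """True once the initial playable chunk has been fetched."""
    return downloaded_bytes >= total_bytes * INITIAL_FRACTION

total = 20_000_000_000                   # a hypothetical 20 GB game
print(can_start(total, 500_000_000))     # 0.5 GB of 20 GB -> False
print(can_start(total, 1_200_000_000))   # 1.2 GB (6%) -> True
```

A real client would also prioritize downloading the assets the player is about to need, which is where the server-side analysis of play data comes in.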

Companies that use this type of cloud gaming include Kalydo, Approxy and SpawnApps.

Tuesday, March 24, 2015

Dedicated hosting services and the cloud


A dedicated hosting service, dedicated server, or managed hosting service is a type of Internet hosting in which the client leases an entire server not shared with anyone else. This is more flexible than shared hosting, as organizations have full control over the server(s), including choice of operating system, hardware, etc. There is also another level of dedicated or managed hosting commonly referred to as complex managed hosting. Complex managed hosting applies to physical servers, hybrid servers, and virtual servers, with many companies choosing a hybrid (combination of physical and virtual) hosting solution. There are many similarities between standard and complex managed hosting, but the key difference is the level of administrative and engineering support that the customer pays for, owing to the increased size and complexity of the infrastructure deployment.
The provider steps in to take over most of the management, including security, memory, storage, and IT support. The service is primarily proactive in nature. Server administration can usually be provided by the hosting company as an add-on service. In some cases a dedicated server can offer less overhead and a larger return on investment. Dedicated servers are most often housed in data centers, similar to colocation facilities, providing redundant power sources and HVAC systems. In contrast to colocation, the server hardware is owned by the provider, and in some cases the provider will offer support for operating systems or applications.
Using a dedicated hosting service offers the benefits of high performance, security, email stability, and control. Because of the relatively high price of dedicated hosting, it is mostly used by websites that receive a large volume of traffic.

Bandwidth and connectivity

Bandwidth refers to the data transfer rate, or the amount of data that can be carried from one point to another in a given period of time (usually one second), and is often expressed in bits (of data) per second (bit/s). For example, visitors to your server, website, or applications consume bandwidth. Providers bill for bandwidth in several ways, including by measured line speed and by total transfer (measured in bytes transferred).

95th percentile method


Line speed, as described in the 95th percentile method, refers to the rate at which data flows from the server or device, measured every 5 minutes over the course of the month, with the top 5% of measurements discarded; billing is then based on the highest remaining measurement. This is similar to a median measurement, which can be thought of as the 50th percentile (50% of measurements above it and 50% below), except that the 95th percentile puts 5% of measurements above the value and 95% below it. This is also known as burstable billing. Line speed is measured in bits per second (or kilobits, megabits, or gigabits per second).
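The 95th percentile computation above is simple to express in code. Here is a minimal sketch in Python, assuming the month's 5-minute throughput samples are already collected; the sample values are hypothetical.

```python
# Burstable (95th percentile) billing: sort the month's 5-minute
# samples, drop the top 5%, and bill at the highest remaining rate.

def billable_rate_mbps(samples_mbps: list[float]) -> float:
    """Return the 95th-percentile rate from 5-minute samples."""
    ordered = sorted(samples_mbps)
    # Keep the lowest 95% of samples, discarding the top 5%.
    cutoff = int(len(ordered) * 0.95)
    return ordered[cutoff - 1]

# 20 samples: a single short burst to 900 Mbit/s is ignored.
samples = [50.0] * 18 + [120.0, 900.0]
print(billable_rate_mbps(samples))  # 120.0
```

This is why short bursts are effectively free under burstable billing: as long as the bursts occupy fewer than 5% of the month's samples, they never become the billable rate.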

Unmetered method


The second method of measuring bandwidth is unmetered service, where the provider caps or controls the server's top line speed. Top line speed in unmetered bandwidth is the total Mbit/s allocated to the server and configured at the switch level. For example, if you buy 10 Mbit/s of unmetered bandwidth, the top line speed is 10 Mbit/s. A 10 Mbit/s line allows the provider to control the rate at which transfers take place, while giving the dedicated server owner the assurance of never being billed for bandwidth overages. Unmetered bandwidth services usually incur an additional charge.

Total transfer method


Some providers calculate total transfer, which is the measurement of actual outbound and inbound data, measured in bytes. Although this is typically the sum of all traffic to and from the server, some providers measure only outbound traffic (traffic from the server out to the Internet).

Bandwidth pooling


This is a key mechanism by which buyers can determine which provider offers the right pricing for bandwidth. Hosting providers bundle an amount of dedicated bandwidth into the monthly fee for a dedicated server. Consider an example: a typical provider might offer a dedicated server with 2 TB of bandwidth for $100 per month. Suppose you buy 10 servers; you then have the ability to consume 2 TB of bandwidth per server. However, suppose that, given your application architecture, only two of the 10 servers are really public-facing, while the rest are used for storage, search, database, or other internal functions. A provider that allows bandwidth pooling lets you consume a total of 20 TB of inbound or outbound bandwidth (or both, depending on its policy) across all servers. A provider that does not offer pooling would let you effectively use only 4 TB, and the remaining 16 TB of bandwidth would be practically unusable. Hosting providers know this, and they cut costs by offering amounts of bandwidth that will frequently go unused. This is known as overselling, and it lets high-bandwidth customers use more than the pool could otherwise supply, because the providers know this will be balanced out by customers who use less than the maximum allowed.
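The pooled vs. non-pooled arithmetic in the example above can be sketched in a few lines. The figures come from that example (10 servers, 2 TB each, only 2 public-facing); the function name is ours.

```python
# How much monthly transfer the customer can actually consume,
# with and without bandwidth pooling.

def usable_tb(servers: int, tb_per_server: int,
              public_facing: int, pooled: bool) -> int:
    """TB of transfer the customer can actually use in a month."""
    if pooled:
        # Pooling: every server's allowance goes into one shared pot.
        return servers * tb_per_server
    # No pooling: only the public-facing servers' allowances get used.
    return public_facing * tb_per_server

print(usable_tb(10, 2, 2, pooled=True))   # 20 TB usable
print(usable_tb(10, 2, 2, pooled=False))  # 4 TB usable; 16 TB wasted
```

The gap between the two numbers is exactly the bandwidth the provider can oversell.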
One reason to choose outsourced dedicated hosting is the availability of high-powered networks from multiple providers. Because dedicated server providers purchase huge amounts of bandwidth, they are able to secure lower volume-based pricing and to include a multi-provider blend of bandwidth. To obtain the same kind of multi-provider network blend without such aggregation, a large investment in core routers, long-term contracts, and expensive monthly bills would need to be in place. Building out a network with a multi-provider bandwidth blend is not economically sensible for most hosting customers on their own.
Many dedicated server providers include a service level agreement based on network uptime; some even offer a 100% uptime guarantee on their network. By securing multiple vendors for connectivity and using redundant hardware, providers are able to guarantee higher uptimes, usually between 99% and 100%, if they are a higher-quality provider. Higher-quality providers are also more likely to be multi-homed across multiple quality uplink providers, which provides significant redundancy in the event one goes down, in addition to potentially better routes to destinations.
In recent years, bandwidth billing has been moving from a per-megabit usage model to a per-gigabyte usage model. Bandwidth was traditionally measured as line speed access, which included the ability to purchase the needed megabits per second at a given monthly cost. As the shared hosting model developed, billing by the gigabyte, or total bytes transferred, began to replace the megabit line-speed model, and dedicated server providers accordingly started offering per-gigabyte billing.
Major players in the dedicated server market offer large amounts of bandwidth, ranging from 500 GB to 3,000 GB, using the overselling model. It is not uncommon for major players to provide dedicated servers with 1 terabyte (TB) of bandwidth or more. Usage models based on byte-level measurement usually include a given amount of bandwidth with each server and a price per gigabyte after a certain threshold has been reached, so expect to pay an additional fee for bandwidth used beyond your allotment. For example, if a dedicated server has been allotted 3,000 GB of bandwidth per month and the customer uses 5,000 GB within the billing period, the additional 2,000 GB of bandwidth will be billed as overage. Each provider has a different billing model; no industry-wide standard has been set.
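The overage billing just described is simple arithmetic. Here is a minimal sketch; the $0.10/GB overage rate is a made-up figure, since, as noted, every provider uses its own model.

```python
# Monthly bill: base fee plus a per-GB charge for any transfer
# beyond the included allotment.

def monthly_bill(base_fee: float, included_gb: int,
                 used_gb: int, overage_per_gb: float) -> float:
    """Base fee plus charges for transfer beyond the allotment."""
    overage_gb = max(0, used_gb - included_gb)
    return base_fee + overage_gb * overage_per_gb

# 5,000 GB used against a 3,000 GB allotment -> 2,000 GB of overage.
print(monthly_bill(100.0, 3000, 5000, 0.10))  # 100 + 2000 * 0.10
print(monthly_bill(100.0, 3000, 2500, 0.10))  # within allotment
```

Comparing this byte-based model against 95th-percentile billing for your actual traffic pattern is the right way to choose between providers.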

Monday, March 23, 2015

Parallels Cloud Storage


Parallels Cloud Storage (Pstorage) is a highly available distributed storage system (virtual SAN) with built-in replication and disaster recovery.
Pstorage allows the virtualization platform to store data on commodity hardware with local hard drives, uniting those drives into a storage cluster for scenarios such as virtualization using virtual machines (VMs) and/or containers (CTs). Pstorage provides fast live migration of VMs and CTs across hardware nodes without having to copy VM/CT data, as well as high availability, since the storage remains accessible even in the event of hardware node failure.

Advantages

The main Pstorage features are listed below:
  • No special hardware requirements. Commodity hardware (SATA/SAS drives, 1 Gbit Ethernet or faster) can be used to create the storage.
  • Strong consistency semantics. This makes Pstorage suitable for running VMs and CTs on top of it, and for access via the iSCSI protocol (unlike object storage such as Amazon S3 or OpenStack Swift).
  • Built-in replication.
  • Automatic disaster recovery on hard drive or node failure.
  • High availability. Data remains accessible even in case of hard drive or node failure.
  • Optional SSD caching. SSD caches boost overall cluster performance for both write and read operations.
  • Data checksumming and scrubbing. Checksumming and scrubbing greatly improve data reliability.
  • Grows on demand. More storage nodes can be added to the cluster to increase its disk space, and a VM/CT image size is not limited by the size of any single hard drive.
  • Petabyte scale.
  • Uniform utilization of hardware performance and capacity across nodes, putting otherwise idle nodes to work.
  • High performance, similar to a SAN.

Thursday, March 19, 2015

Virtual Private Servers in the world revolution Virtualization

A virtual private server (VPS) is a virtual machine sold as a service by an Internet hosting provider. A VPS runs its own copy of an operating system, and customers have root-level access to that operating system instance, so they can install almost any software that runs on that OS. For many purposes a VPS is functionally equivalent to a dedicated physical server and, being software-defined, can be created and configured much more easily. A VPS is priced much lower than an equivalent physical server, but because it shares the underlying physical hardware with other VPSes, performance may be lower and may depend on the workload of other instances on the same hardware node.

Virtualization

The force driving server virtualization is similar to that which led to the development of time-sharing and multiprogramming in the past. Although resources are still shared, as under the time-sharing model, virtualization provides a higher level of security, dependent on the type of virtualization used, as the individual virtual servers are mostly isolated from each other and may each run their own full-fledged operating system, which can be independently rebooted as a virtual instance.
Partitioning a single server so that it appears as multiple servers has become increasingly common on microcomputers since the launch of VMware ESX Server in 2001. The physical server typically runs a hypervisor, which is tasked with creating, releasing, and managing the resources of "guest" operating systems, or virtual machines. These guest operating systems are allocated a share of the resources of the physical server, typically in such a way that a guest is unaware of any physical resources other than those allocated to it by the hypervisor. As a VPS runs its own copy of its operating system, customers have superuser-level access to that operating system instance and can install almost any software that runs on the OS. However, given the number of virtualization clients typically running on a single machine, a VPS generally has limited processor time, RAM, and disk space.
Although VMware and Hyper-V dominate in-house enterprise virtualization, they are less frequently used by VPS providers, mainly due to cost considerations; such providers generally use products like OpenVZ, Virtuozzo, Xen, or KVM.


Hosting

Main article: Comparison of platform virtualization software
Many companies offer virtual private server hosting, or virtual dedicated server hosting, as an extension of their web hosting services. There are several challenges to consider when licensing proprietary software in multi-tenant virtual environments. With unmanaged or self-managed hosting, the customer is left to administer their own server instance. Unmetered hosting is generally offered with no limit on the amount of data transferred on a fixed-bandwidth line. Typically, unmetered hosting is offered with 10 Mbit/s, 100 Mbit/s, or 1000 Mbit/s (with some as high as 10 Gbit/s). This means the customer is theoretically able to transfer roughly 3.33 TB per month on a 10 Mbit/s line, ~33 TB on 100 Mbit/s, and ~333 TB on 1000 Mbit/s (though in practice the values will be significantly less). On a virtual private server this bandwidth is shared, and a fair usage policy should be involved. Unlimited hosting is also commonly marketed, but it is generally limited by acceptable usage policies and terms of service. Offers of unlimited disk space and bandwidth are always false, due to cost, carrying capacity, and technological limitations.
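The monthly-transfer figures quoted above follow from simple arithmetic: a line saturated at N Mbit/s for a whole month moves N million bits every second of that month. A quick sketch (assuming a 30-day month, which gives about 3.24 TB at 10 Mbit/s, close to the ~3.33 TB figure quoted for a slightly longer month):

```python
# Theoretical maximum monthly transfer at a given sustained line speed.

def max_monthly_transfer_tb(mbit_per_s: float, days: int = 30) -> float:
    """Max transfer (in terabytes) for a line saturated all month."""
    seconds = days * 24 * 3600
    bits = mbit_per_s * 1_000_000 * seconds
    return bits / 8 / 1_000_000_000_000  # bits -> bytes -> TB

for speed in (10, 100, 1000):
    print(f"{speed} Mbit/s -> {max_monthly_transfer_tb(speed):.2f} TB/month")
```

Real-world figures fall well short of this ceiling because no server transmits at full line rate around the clock, which is precisely the headroom that fair-use policies and overselling rely on.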
