Amazon Web Services (AWS):-
Amazon Web Services (AWS) is a comprehensive, evolving cloud computing platform provided by Amazon.com. Web services are sometimes called cloud services or remote computing services. The first AWS offerings were launched in 2006 to provide online services for websites and client-side applications.
To minimize the impact of outages and ensure robustness of the system, AWS is geographically diversified into regions. These regions have central hubs in the Eastern USA, Western USA (two locations), Brazil, Ireland, Singapore, Japan, and Australia. Each region comprises multiple smaller geographic areas called availability zones.
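As an illustration of regions and availability zones, the AWS SDK for Python (boto3, an assumed tool not named in this text) can list the zones of a chosen region. This is a minimal sketch, assuming boto3 is installed and AWS credentials are already configured; the region name is illustrative.

```python
# Minimal sketch: listing availability zones in one AWS region.
# Assumes boto3 is installed and credentials are configured
# (e.g., via ~/.aws/credentials); the region name is illustrative.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # an Eastern USA region
response = ec2.describe_availability_zones()

for zone in response["AvailabilityZones"]:
    print(zone["ZoneName"], "-", zone["State"])
```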
The growing AWS collection offers over three dozen diverse services including:
- CloudDrive, which allows users to upload and access music, videos, documents, and photos from Web-connected devices. The service also enables users to stream music to their devices.
- CloudSearch, a scalable search service typically used to integrate customized search capabilities into other applications.
- Dynamo Database (also known as DynamoDB or DDB), a fully-managed NoSQL database service known for low latencies and scalability.
- Elastic Compute Cloud, which allows business subscribers to run application programs and can serve as a practically unlimited set of virtual machines (VMs).
- ElastiCache, a fully managed caching service that is protocol-compliant with Memcached, an open source, high-performance, distributed memory object caching system for speeding up dynamic Web applications by alleviating database load.
- Mechanical Turk, an application program interface (API) that allows developers to integrate human intelligence into Remote Procedure Calls (RPCs) using a network of humans to perform tasks that computers are ill-suited for.
- RedShift, a petabyte-scale data warehouse service designed for analytic workloads, connecting to standard SQL-based clients and business intelligence tools.
- Simple Storage Service (S3), a scalable, high-speed, low-cost service designed for online backup and archiving of data and application programs.
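For example, the online-backup use of S3 might look like the following minimal boto3 sketch; the bucket name and file paths are hypothetical placeholders, and credentials with write access to the bucket are assumed.

```python
# Minimal sketch: backing up a local file to Amazon S3 with boto3.
# Bucket name and file paths are hypothetical placeholders; assumes
# AWS credentials with s3:PutObject permission are configured.
import boto3

s3 = boto3.client("s3")
s3.upload_file("backup.tar.gz", "my-example-bucket", "backups/backup.tar.gz")
print("Upload complete")
```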
All AWS offerings are billed according to usage. The rates vary from service to service.
Difference Between AWS and Windows Azure Cloud Services:-

Parameter         | Amazon Web Services          | Windows Azure
------------------|------------------------------|------------------------------
Deployment Model  | Public Cloud                 | Public Cloud
Service Model     | Infrastructure as a Service  | Platform as a Service and Infrastructure as a Service
Industries        |                              |
Control Interface |                              |
Features          |                              |
Server OS Types   | Linux and Windows            | Windows and Linux
4 Basic Principles of Cloud Computing:-
The Right Kind of Workload Optimization
At the core of cloud computing is the idea of optimizing the workload. This allows you to make the most of your IT resources while increasing your overall flexibility. Power Systems use technology like IBM's Intelligent Threads to switch between processor threading modes dynamically. The Power Systems TurboCore mode lets you get the most performance per core for things like database or transaction workloads. Active Memory Expansion lets you expand your physical memory logically by as much as 100 percent for memory-intensive workloads like SAP.
Limitless Virtualization
With PowerVM, the virtualization component of IBM Power Systems, you can virtualize not just processor resources, but memory and I/O resources as well. You can use PowerVM to adjust capacity dynamically, to move workloads between servers, and to maximize availability. This kind of virtualization even lets you avoid planned downtime, since workloads can be moved off a server before maintenance.
Automated Management
Being able to provision resources within the cloud is key to maximizing utilization and efficiency. It also helps to reduce your TCO and management costs. Utilizing IBM Systems Director Enterprise for Power Systems, you have a way to manage physical as well as virtual servers in an automated fashion. These tools are cross-platform, too. This means that, no matter what your environment, the Power Systems cloud can provision virtual machine images and effectively allocate resources, all while providing you with an accurate picture of how your systems are operating.
Solutions of All Kinds
No matter the shape, size or composition of your cloud, IBM Power Systems has a possible solution. Here are a few of the specific offerings:
• IBM CloudBurst. CloudBurst lets the data center quickly create and implement a private cloud environment. It’s a cloud computing quickstart aimed at a defined portion of the data center.
• IBM WebSphere CloudBurst Appliance. This offering lets you deploy and manage your SOA foundation in a cloud computing environment, and easily deploys WebSphere virtual images to your Power Systems partitions.
• IBM Smart Business Development and Test Cloud. This solution lets you create a private cloud environment for the purposes of development and testing, reducing your operating costs and your test cycle times.
Advantages of Cloud Computing:-
Cost Efficiency
This is the biggest advantage of cloud computing, achieved by eliminating the investment in stand-alone software or servers. By leveraging the cloud's capabilities, companies can save on licensing fees and at the same time eliminate overhead charges such as the cost of data storage, software updates, management, etc. The cloud is in general available at much cheaper rates than traditional approaches and can significantly lower overall IT expenses. At the same time, convenient and scalable charging models have emerged (such as one-time payment and pay-as-you-go), making the cloud even more attractive.
If you want to get more technical and analytical, cloud computing delivers a better cash flow by eliminating the capital expense (CAPEX) associated with developing and maintaining the server infrastructure.
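To make the CAPEX point concrete, here is a toy break-even sketch; all figures are invented for the example and are not from the original text.

```python
# Toy break-even sketch: up-front CAPEX vs. pay-as-you-go cloud billing.
# All prices are hypothetical placeholders for illustration only.
CAPEX_SERVER = 12_000.0      # one-time purchase + setup, in dollars
ONPREM_MONTHLY = 150.0       # power, cooling, maintenance per month
CLOUD_HOURLY = 0.25          # on-demand instance rate per hour
HOURS_PER_MONTH = 730

def onprem_cost(months: int) -> float:
    return CAPEX_SERVER + ONPREM_MONTHLY * months

def cloud_cost(months: int) -> float:
    return CLOUD_HOURLY * HOURS_PER_MONTH * months

for months in (6, 12, 24, 48):
    print(f"{months:>2} months: on-prem ${onprem_cost(months):>9,.2f}"
          f" vs cloud ${cloud_cost(months):>9,.2f}")
```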
Convenience and continuous availability
Public clouds offer services that are available wherever the end user might be located. This approach enables easy access to information and accommodates the needs of users in different time zones and geographic locations. As a side benefit, collaboration booms since it is now easier than ever to access, view and modify shared documents and files.
Moreover, service uptime is in most cases guaranteed, thereby providing continuous availability of resources. The various cloud vendors typically use multiple servers for maximum redundancy. In case of system failure, alternative instances are automatically spawned on other machines.
Backup and Recovery
The process of backing up and recovering data is simplified since the data now resides in the cloud and not on a physical device. The various cloud providers offer reliable and flexible backup/recovery solutions. In some cases, the cloud itself is used solely as a backup repository for the data located on local computers.
Cloud is environmentally friendly
The cloud is in general more efficient than the typical IT infrastructure, and it takes fewer resources to compute, thus saving energy. For example, when servers are not used, the infrastructure normally scales down, freeing up resources and consuming less power. At any moment, only the resources that are truly needed are consumed by the system.
Resiliency and Redundancy
A cloud deployment is usually built on a robust architecture thus providing resiliency and redundancy to its users. The cloud offers automatic failover between hardware platforms out of the box, while disaster recovery services are also often included.
Scalability and Performance
Scalability is a built-in feature of cloud deployments. Cloud instances are deployed automatically only when needed, and as a result you pay only for the applications and data storage you need. Hand in hand comes elasticity, since clouds can be scaled to meet your changing IT system demands.
Regarding performance, the systems utilize distributed architectures which offer excellent speed of computations. Again, it is the provider’s responsibility to ensure that your services run on cutting edge machinery. Instances can be added instantly for improved performance and customers have access to the total resources of the cloud’s core hardware via their dashboards.
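As an illustration of elastic scaling, a threshold-based rule of the kind providers evaluate when adding or removing instances can be sketched as follows; the thresholds, capacity limits, and CPU readings are hypothetical.

```python
# Minimal sketch of a threshold-based autoscaling decision, the kind of
# rule a cloud platform evaluates when scaling instances up or down.
# Thresholds and capacity limits are hypothetical illustration values.
def desired_instances(current: int, cpu_percent: float,
                      min_n: int = 1, max_n: int = 10) -> int:
    if cpu_percent > 75.0:          # overloaded: add an instance
        return min(current + 1, max_n)
    if cpu_percent < 25.0:          # underused: remove an instance
        return max(current - 1, min_n)
    return current                  # load is in the comfortable band

# Example: average CPU readings arriving over time
fleet = 2
for cpu in (80.0, 85.0, 60.0, 15.0, 10.0):
    fleet = desired_instances(fleet, cpu)
    print(f"cpu={cpu:5.1f}% -> {fleet} instance(s)")
```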
Quick deployment and ease of integration
A cloud system can be up and running in a very short period, making quick deployment a key benefit. In the same vein, adding a new user to the system happens almost instantaneously, eliminating waiting periods.
Furthermore, software integration occurs automatically and organically in cloud installations. A business can choose the services and applications that best suit its preferences, while minimal effort is required to customize and integrate those applications.
Increased Storage Capacity
The cloud can accommodate and store much more data compared to a personal computer, offering in a way almost unlimited storage capacity. It eliminates worries about running out of storage space and at the same time it spares businesses the need to upgrade their computer hardware, further reducing the overall IT cost.
Device Diversity and Location Independence
Cloud computing services can be accessed via a plethora of electronic devices that have access to the internet. These devices include not only traditional PCs, but also smartphones, tablets, etc. With the cloud, a "Bring Your Own Device" (BYOD) policy can be easily adopted, permitting employees to bring personally owned mobile devices to their workplace.
An end-user might decide not only which device to use, but also where to access the service from. There is no limitation of place and medium. We can access our applications and data anywhere in the world, making this method very attractive to people. Cloud computing is in that way especially appealing to international companies as it offers the flexibility for its employees to access company files wherever they are.
Smaller learning curve
Cloud applications usually entail smaller learning curves since people are already used to them. Users find it easier to adopt them and come up to speed much faster. Prime examples of this are applications like Gmail and Google Docs.
Disadvantages of Cloud Computing:-
As made clear from the above, cloud computing is a tool that offers enormous benefits to its adopters. However, being a tool, it also comes with its set of problems and inefficiencies. Let's address the most significant ones.
Security and privacy in the Cloud
Security is the biggest concern when it comes to cloud computing. By leveraging a remote cloud-based infrastructure, a company essentially gives away private data and information, things that might be sensitive and confidential. It is then up to the cloud service provider to manage, protect, and retain that data, so the provider's reliability is critical. A company's existence might be put in jeopardy, so all possible alternatives should be explored before a decision is made. On the same note, even end users might feel uncomfortable surrendering their data to a third party.
Similarly, privacy in the cloud is another huge issue. Companies and users have to trust that their cloud service vendors will protect their data from unauthorized users. The various stories of data loss and password leakage in the media do not help to reassure even the most concerned users.
Dependency and vendor lock-in
One of the major disadvantages of cloud computing is the implicit dependency on the provider. This is what the industry calls "vendor lock-in", since it is difficult, and sometimes impossible, to migrate from a provider once you have committed to one. If a user wishes to switch to some other provider, it can be really painful and cumbersome to transfer huge volumes of data from the old provider to the new one. This is another reason why you should carefully and thoroughly weigh all options when picking a vendor.
Technical Difficulties and Downtime
Certainly, smaller businesses will enjoy not having to deal with daily technical issues and will prefer handing those over to an established IT company; however, keep in mind that any system can malfunction from time to time. Outages and downtime are possible even with the best cloud service providers, as the past has shown.
Additionally, you should remember that the whole setup is dependent on internet access, thus any network or connectivity problems will render the setup useless. As a minor detail, also keep in mind that it might take several minutes for the cloud to detect a server fault and launch a new instance from an image snapshot.
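The fault-detection delay mentioned above comes down to periodic health checks. To make the idea concrete, here is a minimal sketch using only the Python standard library; the URL and timeout are hypothetical placeholders.

```python
# Minimal health-check sketch: probe a service URL and report whether
# it responds in time. The URL and timing values are hypothetical.
import urllib.request

def is_healthy(url: str, timeout_s: float = 3.0) -> bool:
    try:
        with urllib.request.urlopen(url, timeout=timeout_s) as resp:
            return 200 <= resp.status < 300
    except OSError:  # covers DNS failure, refused connection, timeout
        return False

if __name__ == "__main__":
    print(is_healthy("http://example.com/"))
```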
Limited control and flexibility
Since the applications and services run on remote, third party virtual environments, companies and users have limited control over the function and execution of the hardware and software. Moreover, since remote software is being used, it usually lacks the features of an application running locally.
Increased Vulnerability
Related to the security and privacy concerns mentioned before, note that cloud-based solutions are exposed on the public internet and are thus more vulnerable to malicious users and hackers. Nothing on the Internet is completely secure, and even the biggest players suffer serious attacks and security breaches. Due to the interdependency of the system, if one of the machines where data is stored is compromised, personal information might leak out to the world.
Big Data:-
Big data is an evolving term that describes any voluminous amount of structured, semi-structured, and unstructured data that has the potential to be mined for information. Big data can be characterized by the 3Vs: the extreme volume of data, the wide variety of types of data, and the velocity at which the data must be processed. Although big data doesn't refer to any specific quantity, the term is often used when speaking about petabytes and exabytes of data, much of which cannot be integrated easily. Because big data takes too much time and costs too much money to load into a traditional relational database for analysis, new approaches to storing and analyzing data have emerged that rely less on data schema and data quality. Instead, raw data with extended metadata is aggregated in a data lake, and machine learning and artificial intelligence (AI) programs use complex algorithms to look for repeatable patterns.
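As a toy illustration of the schema-on-read idea behind data lakes (the records and field names are invented for the example), raw heterogeneous records can be stored as-is and interpreted only at analysis time:

```python
# Toy schema-on-read sketch: heterogeneous raw records are kept as-is
# (as in a data lake) and a schema is applied only when analyzing.
# Records and field names are invented for the example.
import json

raw_lake = [
    '{"user": "alice", "clicks": 3}',
    '{"user": "bob", "clicks": 7, "country": "IE"}',
    '{"sensor": "t-101", "temp_c": 21.5}',     # a different shape entirely
]

total_clicks = 0
for line in raw_lake:
    record = json.loads(line)
    total_clicks += record.get("clicks", 0)    # schema applied at read time

print("total clicks:", total_clicks)
```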
Load Balancing:-
Load balancing is dividing the amount of work that a computer has to do between two or more computers so that more work gets done in the same amount of time and, in general, all users get served faster. Load balancing can be implemented with hardware, software, or a combination of both. Typically, load balancing is the main reason for computer server clustering. On the Internet, companies whose websites get a great deal of traffic usually use load balancing. For load balancing Web traffic, there are several approaches. For Web serving, one approach is to route each request in turn to a different server host address in a domain name system (DNS) table, in round-robin fashion. Usually, if two servers are used to balance a workload, a third server is needed to determine which server to assign the work to. Since load balancing requires multiple servers, it is usually combined with failover and backup services. In some approaches, the servers are distributed over different geographic locations.
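The round-robin approach described above can be sketched in a few lines; the server addresses are hypothetical placeholders.

```python
# Minimal sketch of round-robin load balancing: each incoming request
# is handed to the next server in the list, cycling back to the start.
# Server addresses are hypothetical placeholders.
import itertools

servers = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]
next_server = itertools.cycle(servers)

for request_id in range(7):
    target = next(next_server)
    print(f"request {request_id} -> {target}")
```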
Difference between symmetric and asymmetric encryption techniques:-
Symmetric Encryption:-
1) Speed is fast.
2) Size of the cipher text is usually the same as or less than that of the plain text.
3) The number of keys needed grows with the square of the number of participants (n(n-1)/2 keys for n participants).
4) Key exchange is a major problem (hence, algorithms like the Diffie-Hellman key exchange algorithm are used).
5) More storage space is required.
6) The model relies on a single shared secret key; there is no public/private key pair.
7) Symmetric encryption is an age-old technique.
8) Symmetric encryption uses a single secret key that needs to be shared among the people who need to receive the message.
Asymmetric Encryption:-
1) Slower in speed.
2) Cipher text size is usually greater than that of the plain text.
3) The number of key pairs needed is the same as the number of participants (each participant holds one public/private pair).
4) Key exchange is not a problem, since only public keys need to be shared.
5) Less storage space is required.
6) Asymmetric encryption was introduced to solve the inherent problem of having to share the key, by using a pair of public and private keys.
7) Asymmetric encryption is relatively new.
8) Asymmetric encryption uses a pair of keys, a public key and a private key, to encrypt and decrypt messages when communicating (a code sketch follows this list).
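To make the contrast concrete, here is a minimal sketch using the third-party Python `cryptography` package (an assumption, as the text names no library): one shared key does both jobs in the symmetric case, while a public/private pair splits them in the asymmetric case.

```python
# Minimal sketch contrasting the two models, using the third-party
# "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

message = b"meet at noon"

# Symmetric: one shared secret key both encrypts and decrypts,
# so it must somehow be exchanged with the recipient.
secret_key = Fernet.generate_key()
f = Fernet(secret_key)
assert f.decrypt(f.encrypt(message)) == message

# Asymmetric: encrypt with the public key, decrypt with the private key,
# so the decryption key never has to be shared.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ciphertext = private_key.public_key().encrypt(message, oaep)
assert private_key.decrypt(ciphertext, oaep) == message

print("both round-trips succeeded")
```

Note that the RSA ciphertext here is 256 bytes for a 12-byte message, matching point 2 of the asymmetric list above.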
How to Reduce Security Breaches in Cloud Computing Networks:-
In general, follow these steps to reduce the risk of suffering security breaches:
- Authenticate all people accessing the network.
- Frame all access permissions so users have access only to the applications and data that they've been granted specific permission to access.
- Authenticate all software running on any computer, and all changes to such software. This includes software or services running in the cloud. Your cloud provider needs to automate and authenticate software patches and configuration changes, as well as manage security patches in a proactive way. After all, many service outages come from configuration mistakes.
- Formalize the process of requesting permission to access data or applications. This applies to your own internal systems and to the services that require you to put your data into the cloud.
- Monitor all network activity and log all unusual activity. Deploy intruder-detection technology. Even if your cloud services provider enables you to monitor activities on its environment, you should have an independent view. Even when cloud operators have good security (physical, network, OS, application infrastructure), it is your company's responsibility to protect and secure your applications and information.
- Log all user activity and program activity and analyze it for unexpected behavior (a toy analysis sketch follows this list). Nearly 70 percent of security breaches are caused by insiders (or by people getting help from insiders). Insiders rarely get caught.
- Encrypt, up to the point of use, all valuable data that needs extra protection.
- Regularly check the network for vulnerabilities in all software exposed to the Internet or any external users.
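As a toy illustration of the log-analysis step above, the following sketch flags accounts with unusually many failed logins; the log format and threshold are hypothetical.

```python
# Toy sketch of log analysis for unexpected behavior: flag any user
# with too many failed logins. Log format and threshold are hypothetical.
from collections import Counter

log_lines = [
    "2024-01-01T10:00:01 alice LOGIN_OK",
    "2024-01-01T10:00:05 bob LOGIN_FAIL",
    "2024-01-01T10:00:09 bob LOGIN_FAIL",
    "2024-01-01T10:00:14 bob LOGIN_FAIL",
    "2024-01-01T10:00:20 bob LOGIN_FAIL",
]

FAIL_THRESHOLD = 3
failures = Counter(line.split()[1] for line in log_lines
                   if line.endswith("LOGIN_FAIL"))

for user, count in failures.items():
    if count >= FAIL_THRESHOLD:
        print(f"ALERT: {user} has {count} failed logins")
```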