Unit 5
Amazon CloudWatch monitors your Amazon Web Services (AWS) resources and the
applications you run on AWS in real time. You can use CloudWatch to collect and track metrics,
which are variables you can measure for your resources and applications.
The CloudWatch home page automatically displays metrics about every AWS service you use.
You can additionally create custom dashboards to display metrics about your custom
applications, and display custom collections of metrics that you choose.
You can create alarms that watch metrics and send notifications or automatically make changes to
the resources you are monitoring when a threshold is breached. For example, you can monitor the
CPU usage and disk reads and writes of your Amazon EC2 instances and then use that data to
determine whether you should launch additional instances to handle increased load. You can also
use this data to stop under-used instances to save money.
With CloudWatch, you gain system-wide visibility into resource utilization, application
performance, and operational health.
How it works
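The flow just described, collecting metrics, evaluating alarms against thresholds, and triggering actions, can be driven programmatically. The following minimal sketch uses the boto3 Python SDK, an assumed tooling choice; the instance ID and SNS topic ARN are hypothetical placeholders.

    import boto3

    cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

    # Alarm fires when the average CPU of the (hypothetical) instance
    # stays above 70% for two consecutive 5-minute periods.
    cloudwatch.put_metric_alarm(
        AlarmName="high-cpu-example",
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
        Statistic="Average",
        Period=300,
        EvaluationPeriods=2,
        Threshold=70.0,
        ComparisonOperator="GreaterThanThreshold",
        # Hypothetical SNS topic that receives the notification.
        AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],
    )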
AWS CloudFormation
Configuring each AWS resource by hand is tedious, and it also puts you at a much higher risk of making errors or introducing inconsistencies.
AWS CloudFormation is an AWS service that uses template files to automate the setup of
AWS resources.
You can also apply CloudFormation templates to AWS services that meet specific use cases, such
as AWS Ground Station, Amazon's satellite management service.
In general, if a service runs on AWS, it's a safe bet that you can use CloudFormation to automate
its configuration and deployment.
It is worth noting that CloudFormation is not the only way to configure and deploy services on
AWS. You can handle these processes manually using the AWS command-line interface, API, or
Web console.
Manual provisioning is the approach that teams typically take when getting started with AWS and
learning how to deploy services. However, as they scale their environments up in size, many
teams quickly realize they need a solution like CloudFormation to make the deployment process
faster and more consistent.
How it works
AWS CloudFormation lets you model, provision, and manage AWS and third-party
resources by treating infrastructure as code.
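As a minimal sketch of treating infrastructure as code, the snippet below deploys an inline template that declares a single S3 bucket, using the boto3 Python SDK (an assumption; the AWS CLI and web console work equally well). The stack name is a placeholder.

    import boto3
    import json

    # A minimal CloudFormation template declaring one S3 bucket.
    template = json.dumps({
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "ExampleBucket": {"Type": "AWS::S3::Bucket"},
        },
    })

    cfn = boto3.client("cloudformation", region_name="us-east-1")
    # CloudFormation provisions every resource declared in the template.
    cfn.create_stack(StackName="example-stack", TemplateBody=template)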
Google App Engine (GAE)
GAE primarily supports Go, PHP, Java, Python, Node.js, .NET, and Ruby applications,
although it can also support other languages via "custom runtimes".
The service is free up to a certain level of consumed resources, but only in the standard
environment, not in the flexible environment. Fees are charged for additional storage,
bandwidth, or instance hours required by the application.
It was first released as a preview version in April 2008 and came out of preview in
September 2011.
Any Python framework that supports WSGI using the CGI adapter can be used
to create an application; the framework can be uploaded with the developed
application. Third-party libraries written in pure Python may also be uploaded.
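To make the WSGI point concrete, here is a minimal pure-Python WSGI application of the kind App Engine can host; the callable name is a common convention, not a requirement stated above.

    def application(environ, start_response):
        # A minimal WSGI callable: return a plain-text greeting.
        body = b"Hello from App Engine"
        headers = [("Content-Type", "text/plain"),
                   ("Content-Length", str(len(body)))]
        start_response("200 OK", headers)
        return [body]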
Google App Engine supports many Java standards and frameworks. Core to this is
the Servlet 2.5 technology using the open-source Jetty web server, along with
accompanying technologies such as JSP. JavaServer Faces operates with some
workarounds.
A newer release of App Engine Standard Java, in beta, supports Java 8, Servlet 3.1,
and Jetty 9.
Though the integrated database, Google Cloud Datastore, may be unfamiliar to
programmers, it is accessed and supported with JPA, JDO, and a simple low-level API.
There are several alternative libraries and frameworks you can use to model and
map the data to the database, such as Objectify, Slim3, and the Jello framework.
The Spring Framework works with GAE. However, the Spring Security module (if
used) requires workarounds. Apache Struts 1 is supported, and Struts 2 runs with
workarounds.
The Django web framework and applications running on it can be used on App
Engine with modification. Django-nonrel aims to allow Django to work with non-
relational databases and the project includes support for App Engine.
Bulk downloading
SDK version 1.2.2 adds support for bulk downloads of data using Python. The open-source
Python projects gaebar, approcket, and gawsh also allow users to download and back up
App Engine data. No method for bulk downloading data from GAE using Java currently
exists.
Restrictions
Developers have read-only access to the filesystem on App Engine. Applications can use
only virtual filesystems, like gae-filestore.
App Engine can only execute code called from an HTTP request (scheduled background
tasks allow for self-calling HTTP requests).
Users may upload arbitrary Python modules, but only if they are pure-Python; C and Pyrex
modules are not supported.
Java applications may only use a subset (the JRE Class White List) of the classes from the
JRE standard edition. This restriction does not exist with the App Engine Standard Java 8
runtime.
A process started on the server to answer a request can't last more than 60 seconds (with
the 1.4.0 release, this restriction does not apply to background jobs anymore).
App Engine does not support sticky sessions (a.k.a. session affinity); only replicated
sessions are supported, with limits on the amount of data that can be serialized and on the
time taken for session serialization.
Microsoft Azure
There are many cloud computing platforms offered by different organizations, and Windows
Azure, provided by Microsoft, is one of them. Azure can be described as a set of managed data
centers used to build, deploy, and manage applications and provide services through a global
network. Microsoft Azure offers both PaaS and IaaS, and many programming languages and
frameworks are supported by it.
Pros
The overall cost is low, as resources are allocated on demand and servers are
automatically updated.
It is less vulnerable, as servers are automatically updated and checked for all known
security issues. The whole process is invisible to the developer and thus reduces the risk
of a data breach.
Since new versions of development tools are tested by the Azure team, it becomes easy for
developers to move on to new tools. This also helps the developers to meet the customer’s
demand by quickly adapting to new versions.
Cons
There are portability issues with using PaaS. The environment at Azure may differ from
the environment the application was built for, so the application might have to be adapted
accordingly.
Azure as IaaS (Infrastructure as a Service)
It is a managed compute service that gives complete control of the operating systems and the
application platform stack to application developers. It lets users access, manage, and monitor
the data centers themselves.
Pros
This is ideal for applications where complete control is required. The virtual machine
can be completely adapted to the requirements of the organization or business.
IaaS facilitates very efficient design-time portability. This means applications can be
migrated to Windows Azure without rework; all the application dependencies, such as the
database, can also be migrated to Azure.
IaaS allows a quick transition of services to the cloud, which helps vendors offer services
to their clients easily. This also helps vendors expand their business by selling the
existing software or services in new markets.
Cons
Since users are given complete control, they are tempted to stick to a particular version of
their applications' dependencies, and it might become difficult for them to migrate the
application to future versions.
There are many factors that increase its cost of operation, for example, higher
server maintenance for patching and upgrading software.
It is difficult to maintain legacy apps in IaaS. They can become stuck with older versions of
operating systems and application stacks, resulting in applications that are difficult to
maintain and extend with new functionality over time.
A free trial account can be created on Azure management portal by visiting the following link -
manage.windowsazure.com
Once logged in, you will be redirected to the following screen, where there is a list of services
and applications on the left panel.
Inside the data center, there are many machines or servers aggregated by a switch. The fabric
controller can be thought of as the brain of the Azure service: it analyzes processes and makes
decisions. Fabrics are groups of machines in Microsoft's data center that are aggregated by a
switch. A group of these machines is called a cluster. Each cluster is managed and owned by a
fabric controller, and the controllers are replicated along with these machines. The fabric
controller manages everything inside those machines, e.g., load balancers, switches, etc. Each
machine has a fabric agent running inside it, and the fabric controller can communicate with
each fabric agent.
When a user chooses one of the virtual machines, the operating system, patch updates, and
software updates are performed by the fabric controller. It also decides where a new application
should run, which is one of the most important functions of the fabric controller; it selects the
physical server so as to optimize hardware utilization.
When a new application is published to Azure, an application configuration file written in XML
is attached with it. The fabric controller reads those files in the Microsoft data center and
applies the settings accordingly.
In addition to managing the allocation of resources to a specific application, the fabric
controller also monitors the health of the compute and storage services and performs failure
recovery for the system.
Imagine a situation where four instances of web role are running, and one of them dies. The
fabric controller will initiate a new instance to replace the dead one immediately. Similarly, in
case any virtual machine fails, a new one is assigned by the fabric controller. It also resets the
load balancers after assigning the new machine, so that it points to the new machine
instantaneously. Thus, all the intelligent tasks are performed by the Fabric Controller in Windows
Azure architecture.
The Storage component of Windows Azure represents a durable store in the cloud. Windows
Azure allows developers to store tables, blobs, and message queues. The storage can be accessed
through HTTP. You can also create your own client, although the Windows Azure SDK provides
a client library for accessing Storage.
The word ‘Blob’ expands to Binary Large OBject. Blobs include images, text files, videos, and
audio. There are three types of blobs in the service offered by Windows Azure, namely block,
append, and page blobs.
Block blobs are collections of individual blocks with unique block IDs. Block blobs allow
users to upload large amounts of data.
Append blobs are block-based blobs optimized for append operations, which makes operations
such as logging efficient.
Page blobs are compilations of pages. They allow random read and write operations. When
creating a blob, if the type is not specified, it is set to the block type by default. All blobs must
reside inside a container in your storage account.
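As a hedged sketch of how Storage can be accessed from code, the snippet below uploads a block blob using the azure-storage-blob Python package, an assumed client library choice (as noted above, Storage is also reachable over plain HTTP). The connection string, container name, and blob name are placeholders.

    from azure.storage.blob import BlobServiceClient

    # Placeholder connection string, taken from the storage account's access keys.
    conn_str = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;"

    service = BlobServiceClient.from_connection_string(conn_str)
    # Every blob must live inside a container (see above).
    container = service.get_container_client("example-container")
    # The blob type defaults to a block blob when not specified.
    container.upload_blob(name="hello.txt", data=b"Hello, blob storage")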
Amazon Web Services provides cloud services from tens of data centers and multiple
availability zones (AZs) spread across regions of the world. Each AZ contains multiple data
centers. Customers can set up virtual machines and replicate their data in multiple AZs in order
to have a highly resilient system that is resistant to a server or data center failure.
1. Compute
This is the flagship product of Amazon Web Services. Its Elastic Compute Cloud (EC2)
provides instances (virtual servers) of cloud computing capacity. EC2 has numerous instance
types to choose from, each of varying size and capacity. Instances are tailored to suit specific
applications and workload types, such as accelerated computing and memory-intensive jobs.
EC2 offers auto scaling to accommodate evolving performance, capacity, and system health
needs. The EC2 Container Service and Registry provide images and Docker containers that
customers can work with.
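The sketch below launches a single instance with the boto3 Python SDK, an assumed tooling choice; the AMI ID is a hypothetical placeholder, since valid IDs vary by region.

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Launch one small instance from a placeholder machine image.
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # hypothetical AMI ID
        InstanceType="t3.micro",
        MinCount=1,
        MaxCount=1,
    )
    print(response["Instances"][0]["InstanceId"])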
2. Storage
Simple Storage Service (S3) is scalable storage that is ideal for archival, data backup, and
analytics. Files and data are stored as units referred to as S3 objects, which can be up to 5 GB
in size for a single upload (larger objects can be stored using multipart upload). The objects
are stored in S3 buckets for better organization. Businesses can cut their S3 storage costs by
opting for the Infrequent Access tier or, for longer-term cold storage, Amazon Glacier.
Elastic Block Store is a service that provides persistent block storage that is ideal for EC2
instances, while the Elastic File System is a managed cloud-based file storage service.
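As a minimal sketch of storing an object, the snippet below writes a small S3 object with boto3; the bucket and key names are placeholders, and the bucket is assumed to exist already.

    import boto3

    s3 = boto3.client("s3")

    # Objects live inside buckets; both names here are placeholders.
    s3.put_object(
        Bucket="example-backup-bucket",
        Key="backups/notes.txt",
        Body=b"archived data",
    )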
3. Data management
The Amazon Relational Database Service provides managed data services with options for
major databases, including Amazon Aurora, MySQL, Oracle, PostgreSQL, SQL Server,
MariaDB, and (through DynamoDB) NoSQL. Customers can use DynamoDB Accelerator and
Amazon ElastiCache as a cache for applications that require real-time command response.
Amazon Redshift is a data warehouse that simplifies the process of data analysis and business
intelligence.
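As one concrete example from this family, the sketch below writes and reads a DynamoDB item with boto3. The table name and key schema are hypothetical, and the table is assumed to already exist.

    import boto3

    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table("example-customers")  # hypothetical table

    # Write a single item, then read it back by its primary key.
    table.put_item(Item={"customer_id": "c-001", "name": "Ada"})
    item = table.get_item(Key={"customer_id": "c-001"})["Item"]
    print(item["name"])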
5. Networking
The Amazon Virtual Private Cloud (VPC) service gives administrators firm control over an
isolated portion of the AWS cloud that forms their own virtual network. Amazon Web Services
provisions resources automatically within
the VPC. Administrators can stay on top of network traffic with Network Load Balancer,
Application Load Balancer and other load balancing tools from Amazon Web Services.
6. Monitoring
Administrators can track and manage their AWS cloud via AWS Config, AWS Config
Rules, and AWS Trusted Advisor. These help IT teams avoid needlessly expensive and
improperly configured cloud deployments. Administrators can also automate the
process of infrastructure and system provisioning and configuration with
CloudFormation templates, Chef, and AWS OpsWorks. They can monitor application
and resource health with CloudWatch and the Personal Health Dashboard while using
CloudTrail to retain user activity and API calls for later auditing. There are many more
approaches to AWS monitoring, including the use of third-party tools.
ANEKA:
Aneka is a product of Manjrasoft. Aneka is used for developing, deploying, and
managing cloud applications.
Aneka can be integrated with existing cloud technologies.
Aneka includes an extensible set of APIs associated with programming models like
MapReduce; a toy sketch of this model follows below.
These APIs support different types of cloud deployment models, i.e., private, public, and
hybrid clouds.
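Aneka's own APIs are .NET-based; purely to illustrate the MapReduce programming model that those APIs expose, here is a toy word count written in plain Python. The function names are illustrative and are not part of Aneka's API.

    from collections import defaultdict

    def map_phase(document):
        # Emit (word, 1) pairs, as a MapReduce mapper would.
        return [(word, 1) for word in document.split()]

    def reduce_phase(pairs):
        # Sum the counts emitted for each distinct word.
        counts = defaultdict(int)
        for word, n in pairs:
            counts[word] += n
        return dict(counts)

    docs = ["cloud computing with aneka", "cloud programming models"]
    pairs = [p for d in docs for p in map_phase(d)]
    print(reduce_phase(pairs))  # {'cloud': 2, 'computing': 1, ...}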
Aneka framework:
Aneka is a pure PaaS solution for cloud computing.
Aneka is a cloud middleware product.
Aneka can be deployed on a network of computers, a multicore server, datacenters,
virtual cloud infrastructures, or a mixture of these.
Services in the Aneka container can be classified into three major categories: Fabric
Services, Foundation Services, and Application Services.
1. Fabric services:
Fabric Services define the lowest level of the software stack representing the Aneka
Container. They provide access to the resource-provisioning subsystem and to the
monitoring facilities implemented in Aneka.
2. Foundation services:
Together with the Fabric Services, Foundation Services define the basic
infrastructure management features of the Aneka Cloud. Foundation Services are related
to the logical management of the distributed system built on top of the infrastructure
and provide supporting services for the execution of distributed applications.
3. Application services:
Application Services manage the execution of applications and constitute a layer
that differentiates according to the specific programming model used for developing
distributed applications on top of Aneka.
The Aneka PaaS is built on a solid .NET service-oriented architecture, allowing
seamless integration between public clouds and mainstream applications. The core
capabilities of the framework are expressed through its extensible and flexible
architecture, as well as its powerful application models featuring support for several
distributed and parallel programming paradigms. These features enhance the
development experience of software developers, allowing them to rapidly prototype
elastically scalable applications. Applications ranging from the media and
entertainment industry to engineering, education, health and life sciences, and several
others have proven well suited to the Aneka PaaS.
The integration of the two platforms would give numerous benefits not only to the users of
the Aneka PaaS but also to the customers of the Windows Azure platform, enabling them to
embrace the advantages of cloud computing in terms of more computing resources, an
easier programming model, and more efficient application execution at lower
expense and with lower administration overhead.
Nowadays, cloud computing has great significance in the fields of geology, biology, and
other scientific research areas.
Protein structure prediction is the best example of a research application that makes use of
cloud applications for its computation and storage.
A protein is composed of long chains of amino acids joined together by peptide bonds.
Knowledge of the various structures of proteins helps in the design of new drugs, and
deriving the three-dimensional structure of a protein from its amino acid sequence in
predictive form is known as protein structure prediction.
First, the primary structure of the protein is determined, and then the secondary,
tertiary, and quaternary structures are predicted from the primary one. In this way,
predictions of protein structures are made.
Protein structure prediction also makes use of various other technologies, such as artificial
neural networks, artificial intelligence, machine learning, and probabilistic techniques, and
it holds great importance in fields like theoretical chemistry and bioinformatics.
There are various algorithms and tools that exist for protein structure prediction. CASP
(Critical Assessment of Protein Structure Prediction) is a well-known experiment that
assesses prediction methods, including automated web servers, and the results of research
work are placed on cloud-hosted servers like the CAMEO (Continuous Automated Model
Evaluation) server.
These servers can be accessed by anyone as per their requirements from any place.
Some of the tools or servers used in protein structure prediction are Phobius, FoldX,
LOMETS, Prime, PredictProtein, SignalP, BBSP, EVfold, Biskit, HHpred, Phyre, and
ESyPred3D. Using these tools, new structures are predicted and the results are placed on
the cloud-based servers.
Data Analysis:
Businesses have long used data analytics to help direct their strategy to
maximize profits.
Ideally, data analytics helps eliminate much of the guesswork involved in
trying to understand clients, instead systematically tracking data patterns to
best construct business tactics and operations that minimize uncertainty.
Not only does analytics determine what might attract new customers, often
analytics recognizes existing patterns in data to help better serve existing
customers, which is typically more cost effective than establishing new
business.
In an ever-changing business world subject to countless variants, analytics
gives companies the edge in recognizing changing climates so they can
initiate appropriate action to stay competitive.
Companies no longer need to track inventories manually; instead, they remotely manage
them from data automatically uploaded to cloud drives. The data stored in clouds helps
businesses run more efficiently and gives companies a better understanding of their
customers' behavior.
Satellite Image Processing:
Majorly, there are four kinds of resolution associated with satellite imagery.
These are:
1. Spatial resolution –
It is determined by the sensor's Instantaneous Field of View (IFoV) and is defined as the
pixel size of an image as measured on the ground, i.e., the smallest ground area a single
pixel can represent. Because it reflects the sensor's resolving power, its ability to separate
fine details, it is termed spatial resolution.
2. Spectral resolution –
This resolution measures the size of the wavelength intervals and determines the number of
wavelength intervals that the sensor measures.
3. Temporal resolution –
The word temporal is associated with time, and temporal resolution is defined as the time
that passes between successive acquisitions of imagery of the same area.
4. Radiometric resolution –
This resolution describes the actual information content of an image and is generally
expressed in bit size. It gives the effective bit depth, that is, the number of levels of
brightness the imaging system can record; for example, an 8-bit system records
2^8 = 256 brightness levels.
CRM stands for Customer Relationship Management and is software hosted in the
cloud so that users can access the information over the internet. CRM software provides a
high level of security and scalability to its users and can easily be used on mobile phones to
access the data. Some of the major CRM vendors include Oracle Siebel, Mothernode
CRM, Microsoft Dynamics CRM, Infor CRM, Sage CRM, and NetSuite CRM.
Advantages:
1. High reliability, flexibility and scalability
2. Easy to use
3. Highly secured
4. Easily accessible
ERP is an abbreviation for Enterprise Resource Planning and is software, similar to CRM,
that is hosted on cloud servers and helps enterprises manage and manipulate their
business data as per their needs and user requirements.
ERP software follows a pay-per-use payment model: at the end of the month, the
enterprise pays for the cloud resources it has utilized. There are various ERP vendors
available, like Oracle, SAP, Epicor, Sage, Microsoft Dynamics, Lawson Software, and
many more.
Advantages:
1. Cost effective
2. High mobility
3. Increase in productivity
4. No security issues
5. Scalable and efficient
Social Applications:
Social cloud applications allow a large number of users to connect with each other
using social networking applications such as Facebook, Twitter, LinkedIn, etc. There
are the following cloud-based social applications –
1. Facebook:
Facebook is a social networking website which allows active users to share files,
photos, videos, status updates, and more with their friends, relatives, and business
partners using the cloud storage system. On Facebook, users always get notifications
when friends like and comment on their posts.
2. Twitter:
Twitter is a social networking site and a microblogging system. It allows users to
follow high-profile celebrities, friends, and relatives, and to receive news. It sends and
receives short posts called tweets.
3. Yammer
Yammer is a team collaboration tool that allows a team of employees to chat and
share images, documents, and videos.
4. LinkedIn
LinkedIn is a social network for students, freshers, and professionals.
Business Applications:
Business applications are based on cloud service providers. Today, every
organization requires cloud business applications to grow its business. The cloud also
ensures that business applications are available to users 24*7. There are the
following business applications of cloud computing –
1. MailChimp
MailChimp is an email publishing platform which provides various options to
design, send, and save templates for emails.
2. Salesforce:
Salesforce platform provides tools for sales, service, marketing, ecommerce, and
more. It also provides a cloud development platform.
3. Bitrix24
Bitrix24 is a collaboration platform which provides communication, management,
and social collaboration tools.
4. PayPal
PayPal offers the simplest and easiest online payment mode using a secure internet
account. PayPal accepts payment through debit cards, credit cards, and also
from PayPal account holders.
5. Slack
Slack stands for Searchable Log of all Conversation and Knowledge. It provides a
user-friendly interface that helps us to create public and private channels for
communication.
6. QuickBooks
QuickBooks works on the tagline "Run Enterprise anytime,
anywhere, on any device." It provides online accounting solutions for businesses.
It allows more than 20 users to work simultaneously on the same system.
7. Chatter
Chatter helps us to share important information about the organization in real time.
Scientific Applications:
Scientific applications involve numerical and mathematical calculation in the computer
processing of scientific research and engineering.
Cloud Federations:
An interconnected set of heterogeneous public and/or private clouds from
voluntarily participating users and providers.
Cloud Federation, also known as Federated Cloud, is the deployment and
management of several external and internal cloud computing services to match
business needs. It is a multi-cloud system that integrates private,
community, and public clouds into scalable computing platforms. A federated cloud
is created by connecting the cloud environments of different cloud providers using a
common standard.
1. Cloud Exchange
The Cloud Exchange acts as a mediator between the cloud coordinator and the cloud broker.
The demands of the cloud broker are mapped by the cloud exchange to the available
services provided by the cloud coordinator. The cloud exchange keeps a record of
the present cost, demand patterns, and available cloud providers, and this
information is periodically refreshed by the cloud coordinator.
2. Cloud Coordinator:
The cloud coordinator assigns the resources of the cloud to the remote users based on the
quality of service they demand and the credits they have in the cloud bank. The cloud
enterprises and their membership are managed by the cloud controller.
3. Cloud Broker
The cloud broker interacts with the cloud coordinator, analyzes the service-level
agreements and the resources offered by several cloud providers in the cloud exchange,
and finalizes the most suitable deal for its client, as sketched below.
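Purely as a toy illustration of the brokering idea, and not any real federation API, the sketch below has a broker pick the cheapest provider offer that satisfies a client's quality-of-service demand. The offers and the demand are invented.

    # Hypothetical provider offers, as published through a cloud exchange.
    offers = [
        {"provider": "A", "cpus": 8, "price_per_hour": 0.40},
        {"provider": "B", "cpus": 16, "price_per_hour": 0.55},
        {"provider": "C", "cpus": 16, "price_per_hour": 0.90},
    ]

    def broker(demand_cpus):
        # Keep only offers meeting the QoS demand, then pick the cheapest.
        viable = [o for o in offers if o["cpus"] >= demand_cpus]
        return min(viable, key=lambda o: o["price_per_hour"]) if viable else None

    print(broker(12))  # selects provider B's offer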
Pros of third-party cloud computing:
1. Expertise and resources –
When using a third party for cloud computing infrastructure, you know you are
benefitting from a service whereby the staff are highly trained in this field and the
company has all the resources necessary. It is unlikely you would have this if you
were to opt for a personal cloud, meaning lots of time and money would need to be
invested.
2. Security benefits –
A lot of companies feel more secure putting their data in the hands of an experienced
cloud computing provider rather than jumping into the unknown and trying to
manage the security of their pivotal data themselves.
3. Cost advantages –
Third-party clouds are particularly advantageous for small businesses and the like,
since they do not require huge outlays. To be able to bring your infrastructure in-
house, you would need to make a sizeable investment.
4. Maintenance and support –
If something goes wrong it is the duty of the provider to ensure the problem is fixed.
Cons:
1. Security worries –
Concerns about security are one of the main reasons why businesses consider
bringing their cloud infrastructure in-house. By doing this, you are entirely
responsible for the security of your data, yet time and resources will need to be
heavily invested to get it right.
2. Lack of control –
With third-party cloud computing, you have minimal control over things like how
quickly you can expand the cloud, the granularity of its management, and how it is
used and deployed.
3. Potential cost drawbacks –
If you were to go down the route of a personal cloud, you would be able to keep
your ongoing costs to a minimum, although the upfront expenses would likely be
high. Also, with third-party computing, you will need to pay for more space
whenever you run out.
As you can see, there are pros and cons associated with third-party cloud computing.
There is no right or wrong answer when it comes to determining whether it is an
effective solution; it all depends on your business and what is right for you. Personal
clouds can be successful, but only if you have the skill and resources to build and
manage the cloud. If not, it is not a risk worth taking.