cs notes
3. Interfering with a computer system without right, and intentional interference with computer data
without right;
4. The use of inauthentic data with the intent to pass it off as authentic (data forgery);
Key Concepts:
Public-Key Certificate: A digitally signed statement linking an entity’s identity with a
public key. It’s used for non-repudiation and data integrity. A certificate includes:
1. X.509 version information
2. A unique serial number
3. Common name of the subject
4. Public key associated with the subject
5. Subject’s name (creator)
6. Certificate issuer information
7. Issuer's signature
8. Signature algorithm information
9. Optional X.509 v3 extensions (e.g., to distinguish CA certificates from end-entity
certificates)
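The fields listed above can be modeled as a simple structure. The sketch below is a toy model (field names and values are illustrative, not a real X.509 parser); it shows how the optional v3 basicConstraints extension distinguishes a CA certificate from an end-entity certificate.

```python
from dataclasses import dataclass, field

@dataclass
class Certificate:
    """Toy model of the X.509 fields listed above (not a real parser)."""
    version: int                 # X.509 version; 3 allows extensions
    serial_number: int           # unique per issuer
    subject_common_name: str
    subject_public_key: bytes
    issuer: str
    signature_algorithm: str
    signature: bytes
    extensions: dict = field(default_factory=dict)  # v3 only

    def is_ca(self) -> bool:
        # The basicConstraints extension marks CA vs end-entity certs.
        return bool(self.extensions.get("basicConstraints", {}).get("cA", False))

ca = Certificate(3, 1, "Example Root CA", b"...", "Example Root CA",
                 "sha256WithRSAEncryption", b"...",
                 {"basicConstraints": {"cA": True}})
leaf = Certificate(3, 4711, "www.example.com", b"...", "Example Root CA",
                   "sha256WithRSAEncryption", b"...")
```

In a real certificate these fields are DER-encoded and the issuer's signature covers all of them; the model only captures the structure.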
X.509 Certificates and Applications: These certificates are widely used in web
browsers (e.g., Netscape Navigator, Microsoft Internet Explorer) to support the
Secure Sockets Layer (SSL) protocol for privacy and authentication in network traffic.
Additional applications include:
Code-signing schemes: Such as Java Archives (JAR) and Microsoft Authenticode.
Secure E-Mail standards: Including PEM and S/MIME.
E-Commerce protocols: Like Secure Electronic Transactions (SET).
Representation of Digital Signatures in the ITA 2000
The ITA 2000 established digital signatures based on an asymmetric cryptosystem and hash
functions as the only valid form of authentication for electronic documents, equating them
to paper signatures. However, the Act has several oversights and limitations that could
impact the effectiveness of its digital signature framework.
Key Issues and Oversights in ITA 2000
1. Licensing of Certifying Authorities (CAs):
Requirements for Licensing: Section 21 mandates that CAs meet specific
qualifications regarding expertise, manpower, financial resources, and infrastructure.
These requirements are set by the Central Government; licenses are valid only
for a prescribed period and are non-transferable.
Short Licensing Periods: The Act allows the government to set short licensing
periods, potentially as brief as one year. This is problematic because CAs need
substantial investment in infrastructure and resources to operate, and short
licensing periods may prevent them from breaking even before renewal. A minimum
of five years is recommended to ensure financial viability for CAs.
Transferability Restrictions: Non-transferable licenses limit the flexibility for CAs to
enter partnerships or sell their business if it becomes financially unviable. This
restriction can affect the certificate holders who may be left unsupported if a CA
closes down. A more flexible policy on ownership transfer could help address this
issue.
2. Licensing of Foreign CAs:
Complex Licensing Process for Foreign CAs: For foreign CAs to operate in India, they
must obtain approval from the Controller and maintain a physical office in India,
displaying their license as per Section 32. Additionally, the Central Government’s
permission is required, and the approval must be published in the Gazette.
Impact on International Certificates: Without approval from Indian authorities,
digital certificates issued by foreign CAs, such as VeriSign, may be invalid under
Indian law. This restriction could lead to complications for Indian users holding
foreign certificates and for international business contracts where the foreign
counterpart holds a certificate from an unlicensed foreign CA. A suggested solution is
to accept foreign certificates issued by CAs already approved in their home
countries, as is done in some jurisdictions.
3. Certification Practice Statement Requirement:
Inappropriate Application to Individuals: Section 35, subsection (3), mistakenly
requires applicants for a digital signature certificate to submit a Certification Practice
Statement (CPS). This requirement, intended for CAs, was likely misapplied to
individual applicants. This oversight imposes unnecessary technical burdens on
individual users and indicates a lack of clarity in the drafting of the Act.
Cyber-cafes soon drew further attention due to security threats, with instances of
terrorists using them to send threatening emails, fraudulent banking activities, and
harassment through obscene messages. This compelled authorities to recognize
cyber-cafes as key intermediaries needing oversight. However, the original IT Act of
2000 didn't specifically define "cyber-cafe," creating ambiguity regarding their
responsibilities and liabilities.
There are a number of contexts involved in actually identifying a piece of digital evidence:
1. Physical context: It must be definable in its physical form, that is, it should reside on a
specific piece of media.
2. Logical context: It must be identifiable as to its logical position, that is, where it resides
relative to the file system.
3. Legal context: We must place the evidence in the correct context to read its meaning.
This may require looking at the evidence as machine language, for example, American
Standard Code for Information Interchange (ASCII).
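Reading evidence at the byte level, as the legal context above suggests, is usually done with a hex dump that shows raw bytes alongside their printable ASCII rendering. A minimal sketch:

```python
def hexdump(data: bytes, width: int = 16) -> str:
    """Render raw bytes as hex plus printable ASCII, hexdump-style."""
    lines = []
    for off in range(0, len(data), width):
        chunk = data[off:off + width]
        hexpart = " ".join(f"{b:02x}" for b in chunk)
        # Substitute '.' for bytes outside printable ASCII (0x20-0x7e).
        text = "".join(chr(b) if 0x20 <= b <= 0x7e else "." for b in chunk)
        lines.append(f"{off:08x}  {hexpart:<{width * 3}} {text}")
    return "\n".join(lines)

print(hexdump(b"GIF89a\x00\x01evidence"))
```

Examining the hex column reveals file signatures (here the GIF header) even when the file extension or metadata has been altered.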
Following are some guidelines for the (digital) evidence collection phase:
1. Adhere to your site’s security policy and engage the appropriate incident handling
and law enforcement personnel.
2. Capture a picture of the system as accurately as possible.
3. Keep detailed notes with dates and times. If possible, generate an automatic
transcript (e.g., on Unix systems the "script" program can be used; however, the output
file it generates should not be written to media that is part of the evidence). Notes
and printouts should be signed and dated.
4. Note the difference between the system clock and Coordinated Universal Time (UTC).
For each timestamp provided, indicate whether UTC or local time is used (since 1972,
over 40 countries throughout the world have adopted UTC as their official time source).
5. Be prepared to testify (perhaps years later) outlining all actions you took and at what
times. Detailed notes will be vital.
6. Minimize changes to the data as you are collecting it. This is not limited to content
changes; avoid updating file or directory access times.
7. Remove external avenues for change.
8. When confronted with a choice between collection and analysis you should do
collection first and analysis later.
9. Needless to say, your procedures should be implementable. As with any aspect of an
incident response policy, procedures should be tested to ensure feasibility, particularly
in a crisis. If possible, procedures should be automated for reasons of speed and
accuracy. Being methodical always helps.
10. For each device, a systematic approach should be adopted to follow the guidelines
laid down in your collection procedure. Speed will often be critical; therefore, where
there are a number of devices requiring examination, it may be appropriate to spread
the work among your team to collect the evidence in parallel. However, on a single
given system collection should be done step by step.
11. Proceed from the volatile to the less volatile; order of volatility is as follows:
• Registers, cache (most volatile, i.e., contents lost as soon as the power is turned OFF);
• routing table, Address Resolution Protocol (ARP) cache, process table, kernel
statistics, memory;
• temporary file systems;
• disk;
• remote logging and monitoring data that is relevant to the system in question;
• physical configuration and network topology;
• archival media (least volatile, i.e., holds data even after power is turned OFF).
12. You should make a bit-level copy of the system’s media. If you wish to do forensics
analysis you should make a bit-level copy of your evidence copy for that purpose, as
your analysis will almost certainly alter file access times. Try to avoid doing forensics on
the evidence copy.
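The bit-level copy in guideline 12 is normally made with a dedicated imager (dd, or a forensic tool behind a write-blocker). The sketch below shows only the idea, a chunked byte-for-byte copy plus a SHA-256 digest so the image can later be verified against the original; the file paths are illustrative.

```python
import hashlib

def image_media(src_path: str, dst_path: str, chunk_size: int = 1 << 20) -> str:
    """Make a bit-level copy of src_path and return its SHA-256 digest.

    In a real acquisition src_path would be a raw device (e.g. /dev/sdb)
    accessed through a write-blocker; here it is just a file path.
    """
    h = hashlib.sha256()
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            h.update(chunk)
            dst.write(chunk)
    return h.hexdigest()

def verify_image(path: str) -> str:
    """Re-hash a file so the image can be checked against the acquisition digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()
```

Recording the digest at acquisition time, and re-computing it before analysis, is what lets you demonstrate later that the evidence copy was never altered.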
RFC 2822:
RFC 2822, also known as the Internet Message Format standard, specifies the syntax for
email message headers and the format for email addresses. It provides rules for valid email
address formats and emphasizes certain characteristics that must be followed, such as the
Message-ID header.
Below is a breakdown of some important aspects related to RFC 2822, as well as an
overview of email headers and tracking.
Key Valid Email Address Formats (RFC 2822)
RFC 2822 allows various valid formats for email addresses. Some examples include:
1. Standard Format: joshi@host.net
2. IP Address in Brackets: john@[10.0.3.19]
3. Quoted Strings: "Joshi Ganesh"@host.net or "Joshi Ganesh"@[10.0.3.19]
Common Invalid Email Formats
Multiple "@" Symbols: joshi@ganesh@host.net — Two "@" symbols are not allowed.
Leading Dot: joshi@.host.net — Domain names cannot have a leading dot.
Leading Dash in Domain: joshi@-host.net — Domain names cannot begin with a
dash.
Invalid TLD: joshi@host.web — "web" is not a valid top-level domain (TLD).
Invalid IP Format: joshi@[10.0.3.1999] — The IP address is not valid.
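The valid/invalid cases above can be checked with a simplified validator. This is a rough approximation, not the full RFC 2822 addr-spec grammar (which has many more productions), and the TLD whitelist is a tiny illustrative subset:

```python
import re

ATOM = r"[A-Za-z0-9!#$%&'*+/=?^_`{|}~-]+"
LOCAL = rf"(?:{ATOM}(?:\.{ATOM})*|\"[^\"\\]*\")"     # dot-atom or quoted string
LABEL = r"[A-Za-z0-9](?:[A-Za-z0-9-]*[A-Za-z0-9])?"  # no leading/trailing dash or dot

# Illustrative subset only; a real validator would consult the IANA TLD list.
KNOWN_TLDS = {"com", "net", "org", "edu", "gov", "in"}

def is_valid_address(addr: str) -> bool:
    """Rough check of the addr-spec shapes shown above (not full RFC 2822)."""
    local, sep, domain = addr.rpartition("@")
    if not sep or not re.fullmatch(LOCAL, local):
        return False
    m = re.fullmatch(r"\[(\d{1,3})\.(\d{1,3})\.(\d{1,3})\.(\d{1,3})\]", domain)
    if m:  # domain-literal such as [10.0.3.19]: each octet must be 0-255
        return all(int(octet) <= 255 for octet in m.groups())
    labels = domain.split(".")
    if len(labels) < 2 or not all(re.fullmatch(LABEL, lab) for lab in labels):
        return False
    return labels[-1].lower() in KNOWN_TLDS
```

Splitting on the last "@" means a quoted local part may legally contain "@", while a bare second "@" fails the dot-atom pattern, matching the rules listed above.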
E-Mail Header Fields
RFC 2822 specifies the structure of email headers, which may include fields such as:
From: The sender’s email address.
To: The recipient's email address.
Subject: The subject of the email.
Date: The date and time the email was sent.
Message-ID: A unique identifier for each email message, typically enclosed in angle
brackets.
Message-ID
The Message-ID header is essential and must have a globally unique identifier. It helps
identify individual emails and is used in the "Message-ID," "In-Reply-To," and "References"
headers. For an email to be valid, the Message-ID must be included in the appropriate
header field.
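Python's standard library can generate a Message-ID of the required form, globally unique and enclosed in angle brackets. The addresses below reuse the hypothetical host.net examples from above:

```python
from email.message import EmailMessage
from email.utils import make_msgid

msg = EmailMessage()
msg["From"] = "joshi@host.net"       # illustrative addresses, as above
msg["To"] = "ganesh@host.net"
msg["Subject"] = "Evidence request"
# make_msgid combines a timestamp, random data, and the given domain,
# so each call yields a globally unique identifier.
msg["Message-ID"] = make_msgid(domain="host.net")
msg.set_content("Hello")

mid = msg["Message-ID"]
```

A reply would copy this value into its In-Reply-To and References headers, which is what lets mail clients and investigators reconstruct a conversation thread.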
Email Tracing
Tracing emails is often done for forensic purposes, especially when investigating
cybercrimes or identifying spam and viruses. Since email headers can be spoofed, it's crucial
to understand their structure and limitations when tracing emails. In some cases, spam or
virus-generated emails may not provide reliable tracing information.
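Tracing works mainly from the Received headers, which each relay prepends in transit, so the topmost entry is the most recent hop. Because a sender can forge the lower entries, analysts read them top-down starting from their own trusted mail server. A sketch of extracting the hops (the sample headers are invented):

```python
from email import message_from_string

RAW = """\
Received: from mx.example.net (mx.example.net [203.0.113.7])
\tby mail.host.net; Mon, 01 Jan 2024 10:00:00 +0000
Received: from sender.example.org ([198.51.100.9])
\tby mx.example.net; Mon, 01 Jan 2024 09:59:58 +0000
From: joshi@host.net
To: ganesh@host.net
Subject: test
Message-ID: <abc123@host.net>

body
"""

msg = message_from_string(RAW)
# get_all returns the Received headers in order: most recent relay first.
hops = msg.get_all("Received")
```

Walking the hops from your own server downward, and stopping at the first entry you cannot corroborate, is the practical limit of header-based tracing once spoofing is possible.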
Network Forensics:
Network Forensics and Wireless Forensics are crucial disciplines within the broader field of
computer forensics. Network forensics deals with the monitoring, capturing, and analysis of
network traffic to uncover suspicious or illegal activity. The rise of open, unprotected Wi-Fi
networks presents a major security risk, as demonstrated by a survey revealing that 50% of
Wi-Fi connections in certain areas remained unprotected. This highlights the need for
network forensics professionals to understand wireless networks and the technology
surrounding them.
Wireless Forensics
Wireless forensics, a subset of network forensics, specifically focuses on the collection and
analysis of data from wireless networks. Marcus Ranum is credited with coining the parent
term "network forensics" in 1997; wireless forensics applies the same discipline to wireless
traffic to uncover network anomalies, security breaches, and unauthorized activities. This
field is especially relevant in modern times with the widespread use of VoIP technologies
over Wi-Fi, where evidence might include both data and voice communications.
Key Aspects of Wireless Forensics
Traffic Capture: Wireless forensics involves capturing all data traveling over Wi-Fi
networks, which can include everything from web traffic to VoIP calls.
Analysis of Network Events: The goal is to identify any network anomalies, such as
unauthorized access, attacks, or breaches.
Security Attack Identification: By analyzing the traffic, forensic experts can
determine the source of security attacks and investigate their impact on the
network.
Evidence Preservation: Similar to traditional computer forensics, wireless forensics
requires the careful identification, preservation, and analysis of evidence in a way
that can be presented in court.
Challenges in Wireless Forensics
Wireless networks pose unique challenges for forensic experts, particularly in capturing and
analyzing Wi-Fi traffic. Factors such as signal range, encryption methods, and the dynamic
nature of wireless traffic make it harder to perform forensic activities compared to
traditional wired networks. Despite these challenges, it remains essential for network
forensics professionals to follow the same basic forensic principles:
1. Identify the evidence.
2. Preserve the integrity of the evidence.
3. Analyze the evidence impartially.
4. Report the findings in a legally acceptable format.
Solving a computer forensics case involves a series of methodical steps to ensure the
integrity of evidence and provide accurate results. Below is a summary of the key steps
involved in solving such a case:
1. Prepare for the Forensics Examination: Before beginning, ensure you have the
necessary tools, knowledge, and permissions to conduct the investigation.
2. Understand the Case: Talk to key stakeholders, such as law enforcement, clients, or
others involved in the case, to gather background information on what you are
looking for and the context surrounding the case.
3. Verify the Case: Ensure that the case has a solid foundation before proceeding. This
includes confirming that there is a legitimate reason for the investigation and
understanding the potential scope of the issue.
4. Assemble Tools for Data Collection: Gather the necessary forensic tools. This
includes software for imaging (e.g., EnCase, Sleuth Kit), write-blockers to prevent
altering the data, and hardware for storage and examination.
5. Identify the Target Media: Determine the exact media (e.g., hard drive, USB drive,
floppy disk) from which evidence will be collected. The target media may be the
computer system itself or any external storage devices associated with it.
6. Collect the Data: Create an exact, bit-by-bit copy (image) of the target media. This is
done using forensic imaging software. It's important to use write-blockers during this
process to avoid altering the original data. Also, be sure to check email records, as
they often contain valuable information.
7. Examine the Collected Evidence: Review the data from the image you’ve created.
Analyze files, metadata, logs, and any other relevant digital artifacts that could
provide insight into the case. Use appropriate forensic tools to assist in uncovering
open files, encrypted files, and other potentially hidden information.
8. Analyze the Evidence: Manually examine the storage media to uncover crucial
information. If the target system is running Windows, pay particular attention to the
registry, which can contain valuable data on user activity. Additionally, review
internet searches, emails, and images, as criminals often hide incriminating
information through methods like steganography (hiding data within images or other
files).
9. Report Findings: After thoroughly analyzing the data, prepare a detailed report that
documents your findings, the steps you took, and where specific pieces of evidence
were found. The report should be clear, objective, and suitable for use in legal
proceedings, if necessary.
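The metadata review in steps 7 and 8 often starts with a timeline: every file under the mounted image, ordered by modification time, with timestamps normalized to UTC per the collection guidelines. A minimal sketch (the mount point is whatever directory the examiner mounted the image copy at):

```python
import os
from datetime import datetime, timezone

def build_timeline(root: str):
    """Return (mtime, path) pairs for every file under root, oldest first.

    Times are reported in UTC, per the evidence-collection guidelines.
    A real examination would also record access and creation times.
    """
    entries = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            ts = os.stat(path).st_mtime
            entries.append((datetime.fromtimestamp(ts, tz=timezone.utc), path))
    entries.sort()
    return entries
```

Clusters of activity in the timeline (files created or modified in a short window) are often the quickest way to locate the events relevant to the case.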
7. Recuva
o A free data recovery tool for Windows that recovers accidentally deleted files
from hard drives and memory cards.
o Website: Recuva
8. Restoration
o A freeware recovery tool for Windows that helps recover deleted files from
hard drives.
o Website: Restoration
9. Undelete Plus
o A free file recovery tool compatible with all versions of Windows. It supports
FAT12/16/32, NTFS, and NTFS5 filesystems and can recover data from solid-
state devices.
o Website: Undelete Plus
10. R-Studio
o A data recovery software suite capable of recovering files from various file
systems, including FAT, NTFS, HFS, UFS, Ext2/Ext3 (Linux), and more.
o Website: R-Studio
11. Stellar Phoenix
o A suite of tools for recovering lost data from hard drives, including damaged
or corrupted partitions.
o Website: Stellar Phoenix
12. DeepSpar Disk Imager
o A dedicated disk imaging device designed to handle disk-level problems and
recover data from bad sectors on hard drives.
o Website: DeepSpar Disk Imager
13. Adroit Photo Recovery
o A specialized tool for recovering photos, including fragmented or corrupted
images. It supports high-definition RAW image recovery from cameras like
Canon and Nikon.
o Website: Adroit Photo Recovery
3. Carving Tools
Carving tools are used to recover fragmented or corrupted files by identifying and extracting
them based on file headers and footers. This technique is essential in forensics for retrieving
data that has been partially or completely erased but still exists in unallocated space.
These carving tools can recover data even if the file system metadata is damaged or missing,
which is particularly useful when working with fragmented files like emails, documents, or
images.
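A header/footer carver of the kind described above can be sketched in a few lines. This one carves JPEGs out of a raw buffer using the SOI marker (FF D8 FF) as the header and the EOI marker (FF D9) as the footer:

```python
JPEG_HEADER = b"\xff\xd8\xff"
JPEG_FOOTER = b"\xff\xd9"

def carve_jpegs(raw: bytes):
    """Extract candidate JPEGs from raw/unallocated space by header and footer."""
    out, pos = [], 0
    while True:
        start = raw.find(JPEG_HEADER, pos)
        if start == -1:
            break
        end = raw.find(JPEG_FOOTER, start + len(JPEG_HEADER))
        if end == -1:
            break
        out.append(raw[start:end + len(JPEG_FOOTER)])
        pos = end + len(JPEG_FOOTER)
    return out

# Note: real carvers such as Foremost and Scalpel also validate internal
# structure and handle fragmentation; this naive version assumes each
# file is stored contiguously.
```

This works even when file system metadata is gone, which is exactly the scenario the paragraph above describes: the bytes are still in unallocated space, only the pointers to them are lost.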
File Carving Tools
1. Datalifter Extractor Pro
o A file carving tool that runs on multiple threads to leverage modern
processor capabilities, helping recover files from raw disk images.
o Website: Datalifter Extractor Pro
2. Simple Carver Suite
o This suite includes a set of tools designed for data recovery, forensics
computing, and E-Discovery. It was initially designed for data recovery but
now includes features for file decoding, identification, and classification.
o Website: Simple Carver Suite
3. Foremost
o Foremost is a console-based file carving tool that recovers files based on their
headers, footers, and internal data structures. It is widely used for forensic
analysis.
o Website: Foremost
4. Scalpel
o Scalpel is a fast file carver that extracts files from raw image files or device
files by reading header and footer definitions. It works across various file
systems, including FAT, NTFS, and Ext2/3.
o Website: Scalpel
5. CarvFs
o A virtual file system (FUSE) that enables recursive, in-place carving of files
from raw data and EnCase images. It allows for zero-storage carving.
o Website: CarvFs
6. LibCarvPath
o A shared library that enables carving tools to perform zero-storage carving on
virtual files using the CarvFs system.
o Website: LibCarvPath
7. PhotoRec
o PhotoRec is a file recovery tool designed to recover lost files, including
videos, documents, archives, and pictures from hard drives, CD-ROMs, and
memory cards.
o Website: PhotoRec