MEGA cloud storage forensics. What to do when someone deletes data…

1 Let's first check out the files Eve left us. The aim of this step is to narrow down the incident interval.

Last modified can stand for the time the file was created (or modified) on the device and then uploaded. This is the case when date added is older than last modified.

Created (modified) locally and uploaded

If they are the same, the file was possibly modified inside the MEGA client, because the file was re-uploaded behind the scenes. MEGA stores encrypted data and data is decrypted only on the client side, so a re-upload is required for every modification. It could also be that the file was created and uploaded within the same minute.

Modified in browser

Carefully inspect every file and create an incident timeline.
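The two timestamp cases above can be sketched in code. A minimal sketch, assuming hypothetical file records with date_added and last_modified Unix timestamps (the field names and sample values are illustrative, not MEGA's actual export format):

```python
# Hypothetical file records; names and timestamps are made up for illustration
files = [
    {"name": "notes.txt", "date_added": 1645256706, "last_modified": 1645256706},
    {"name": "plan.txt",  "date_added": 1645260000, "last_modified": 1645256000},
]

def classify(f):
    # Equal timestamps: modified inside the MEGA client (re-upload),
    # or created and uploaded within the same minute
    if f["date_added"] == f["last_modified"]:
        return "modified in client / same-minute upload"
    # Differing timestamps: created (or modified) locally, then uploaded
    return "created or modified locally, then uploaded"

# Order by last_modified to start a rough incident timeline
for f in sorted(files, key=lambda f: f["last_modified"]):
    print(f["name"], "->", classify(f))
```

Sorting by last_modified gives a first rough ordering of events before the account metadata is pulled in.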

Incident(s) timeline

From the following graph we can deduce when Eve was active on MEGA: the first time, when she uploaded a single .txt file, and the next day the rest.

2 Also download each file and inspect its metadata; in this case the files were only .txt files without any significant metadata, so this step is skipped.

3 After acquiring possible incident intervals, let's explore the metadata. Metadata can be found inside Settings > Security > Metadata. Inside the Metadata zip the following files should exist:

Metadata

We are most interested in files.json and sessions.json. The following property descriptions are backed by ChatGPT; if you require more accurate information, check out the MEGA source code here or at this issue.

files.json

file_related_ips An array containing a list of IP addresses and port numbers. These entries likely represent the IP address and port number associated with file transfers or access to the stored file. Each item in the array is in the format "IP:Port".
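Since each file_related_ips entry is an "IP:Port" string, splitting on the last colon takes the port off cleanly (a small sketch; the sample addresses are made up):

```python
def split_ip_port(entry):
    # Split on the LAST colon so the address part stays intact
    ip, _, port = entry.rpartition(":")
    return ip, int(port)

# Made-up sample entries in the "IP:Port" format described above
for item in ["203.0.113.5:443", "198.51.100.7:51820"]:
    ip, port = split_ip_port(item)
    print(ip, port)
```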

links An array of objects, each representing a link to a file or resource. Each object has the following properties:

  • link_identifier A unique identifier for the link.

link_identifier in URL

  • node_identifier An identifier for the node (likely a server or location) associated with the link.
  • timestamp A timestamp indicating when the link was created or accessed.
  • ip The IP address and port number from which the link was accessed.

lastuploaded_dailyrefresh This appears to be a timestamp indicating the last time a daily refresh operation was performed regarding uploaded files.

lastdownloaded_dailyrefresh This is similar to the previous property but relates to downloaded files.

sessions.json

created: This is a Unix timestamp (an integer representing the number of seconds since January 1, 1970) indicating the time when this session was created.

lastactive: Another Unix timestamp representing the time of the last activity within this session.

useragent: A string providing information about the user agent, which can help identify the software and device used for this session. In this case, it indicates that it’s from MEGA running on an iOS device with specific version details.

ip: The IP address and port associated with the session.

country: The country code (in this case, "AU" represents Australia) where the session's IP address is registered.

alive: A numeric value, possibly indicating the state of the session (0 might represent an inactive or expired session).

additional_ip_activity: This is a list of dictionaries, each describing additional IP activity associated with the session. It includes the following information for each additional IP activity:

  • firstseen: The Unix timestamp when this IP was first seen in the session.
  • lastseen: The Unix timestamp when this IP was last seen in the session.
  • ip: The IP address.
  • country: The country code associated with the IP address.
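The created, lastactive, firstseen, and lastseen fields are all Unix timestamps, so any of them converts the same way (shown here in UTC; in practice use the account owner's timezone):

```python
from datetime import datetime, timezone

# Convert a Unix timestamp (seconds since 1970-01-01 UTC) to a readable date
ts = 1645256706
print(datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y-%m-%d %H:%M:%S"))
# → 2022-02-19 07:45:06
```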

4 After exploring the Metadata, let's process the data using Python. The idea is to eliminate records outside the incident interval (only for now) and add date fields (from the timestamps). The following Python code does that.

from datetime import datetime
import pytz
import json

def get_date(time):
    return datetime.fromtimestamp(time, pytz.timezone('YOUR TIMEZONE'))

# Define incident interval
start = 1645256706
end = start + 60 * 60 * 1

print(f"Incident start = {get_date(start)}")
print(f"Incident end = {get_date(end)}")

def checkIfValueInRange(value):
    return start <= value <= end

# Process sessions.json -> sessionsFiltered.json

with open('sessions.json') as f:
    data = json.load(f)

filtered_data = []

for entry in data:
    if checkIfValueInRange(entry["created"]) or checkIfValueInRange(entry["lastactive"]):
        filtered_data.append(entry)
        continue
    if 'additional_ip_activity' in entry:
        for child in entry["additional_ip_activity"]:
            if ('firstseen' in child and checkIfValueInRange(child["firstseen"])) or \
               ('lastseen' in child and checkIfValueInRange(child["lastseen"])):
                filtered_data.append(entry)
                break  # entry kept once; stop scanning its IP activity

with open("sessionsFiltered.json", "w") as outfile:
    json.dump(filtered_data, outfile)

# Add readable date fields to the filtered sessions

with open('sessionsFiltered.json') as f:
    data = json.load(f)

modified_data = []

for entry in data:
    entry.update({"created_date": get_date(entry["created"]).strftime("%Y-%m-%d %H:%M:%S")})
    entry.update({"lastactive_date": get_date(entry["lastactive"]).strftime("%Y-%m-%d %H:%M:%S")})

    if 'additional_ip_activity' in entry:
        for index, child in enumerate(entry["additional_ip_activity"]):
            if 'firstseen' in child:
                entry["additional_ip_activity"][index].update(
                    {"firstseen_date": get_date(child["firstseen"]).strftime("%Y-%m-%d %H:%M:%S")})
            if 'lastseen' in child:
                entry["additional_ip_activity"][index].update(
                    {"lastseen_date": get_date(child["lastseen"]).strftime("%Y-%m-%d %H:%M:%S")})

    modified_data.append(entry)

with open("sessionsFiltered.json", "w") as outfile:
    json.dump(modified_data, outfile)

# Process files.json -> filesFiltered.json

with open('files.json') as f:
    data = json.load(f)

filtered_links = []

for entry in data["links"]:
    if checkIfValueInRange(entry["timestamp"]):
        entry.update({"date": get_date(entry["timestamp"]).strftime("%Y-%m-%d %H:%M:%S")})
        filtered_links.append(entry)

data["links"] = filtered_links

with open("filesFiltered.json", "w") as outfile:
    json.dump(data, outfile)

The code is maybe not perfect, but the focus is on finding Eve, not writing a framework :).

After sessionsFiltered.json and filesFiltered.json are generated there should be a lot fewer records. The next step is to try to conclude Eve's device, browser, IP address, … Later we can use that information on the entire dataset to try to find out when Eve first accessed MEGA with the same device and gather more evidence.
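One way to narrow down the device is to count distinct (useragent, IP, country) combinations in the filtered sessions. A sketch with made-up records shaped like sessions.json entries (in practice, load sessionsFiltered.json instead of the inline list):

```python
from collections import Counter

# Made-up records shaped like sessions.json entries
sessions = [
    {"ip": "203.0.113.5:49152", "country": "AU", "useragent": "MEGAiOS/5.0"},
    {"ip": "203.0.113.5:50001", "country": "AU", "useragent": "MEGAiOS/5.0"},
    {"ip": "198.51.100.7:443",  "country": "AU", "useragent": "Mozilla/5.0"},
]

# Strip the port so the same host seen from different source ports
# collapses into a single fingerprint
profile = Counter(
    (s["useragent"], s["ip"].rpartition(":")[0], s["country"]) for s in sessions
)
for (agent, ip, country), n in profile.most_common():
    print(n, agent, ip, country)
```

The most frequent fingerprint is a good candidate for Eve's own device; rare ones may point to access from another location.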

Further IP address inspection can be done here. If it turns out to be a public library IP address, we can easily use social engineering to access the library's video footage ("Oh, someone stole my jacket with my wallet, please can I see the video…"). Otherwise, at least Eve's ISP name is acquired, provided Eve didn't use a VPN (in that case the IP address is a dead end).

The rest of the investigation is based on social engineering and the list of suspects.