this post was submitted on 14 May 2025

NASA's Perseverance Mars Rover


On the plains of Jezero, the secrets of Mars' past await us! Follow for the latest news, updates, pretty pics, and community discussion on NASA and the Jet Propulsion Laboratory's most ambitious mission to Mars!

[–] [email protected] 3 points 1 week ago (1 children)

I, for one, really appreciate your efforts. I would never have begun thinking about this data if you hadn't been so consistent about posting it. Eventually I'm going to have to learn to extract the data myself, but right now I'm a little busy reading about the maturing science on this mission. I do tend to get in a little over my head with projects like this sometimes, but I'm steadily working away at that abrasion patch guide, so that'll come first.

BTW - new abrasion patch! I've already posted about it!

[–] [email protected] 2 points 1 week ago (1 children)

I just saw the abrasion patch :) They did not waste time doing that :)

Extracting the drive data from JSON is very straightforward if you have access to any modern spreadsheet.

These days I don't have access to the MS tools (Excel etc.), so with the help of others I created a series of automatic Google Sheets spreadsheets for M2020, MSL, and the Mars Helicopter. Each spreadsheet automatically imported the most recent waypoint JSON for its mission whenever it was opened or refreshed, treated the JSON as comma separated values (CSV) data, and built a large table in the spreadsheet. Sadly, that auto 'import JSON' function was bespoke and specific to Google Sheets.

Extracting data from the large table created from the CSV uses very basic formulas available in all spreadsheets.

Like a few others, I'm in the process of de-Googling my life, so some time back I converted my spreadsheets for M2020 & MSL to LibreOffice Calc. I've had them both working for many months already, but at this time Calc does not come with a built-in 'Import JSON' function. So I simply copy the JSON data, paste it into an online tool that uses your own browser to convert the JSON to CSV, copy that output, and paste the CSV into Calc, where my spreadsheet automatically builds the table I share an image of. It takes less than one minute to open the JSON, copy the data, open the converter, paste in the raw JSON, copy the output, and paste it into Calc.
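If you'd rather skip the online converter, a few lines of Python can do the same flattening locally. Treat this as a rough sketch: the file name and the 'features'/'properties' layout are assumptions to adjust to whatever the real waypoint JSON actually contains.

```python
import csv
import json

# Local stand-in for the online converter: flatten a waypoint JSON
# (GeoJSON-style FeatureCollection) into a CSV that can be pasted into Calc.
# "waypoints.json" and the features -> properties structure are placeholders.
with open("waypoints.json") as f:
    features = json.load(f)["features"]

# Gather every property name that appears, so the header row covers all fields
fieldnames = sorted({key for feat in features for key in feat["properties"]})

with open("waypoints.csv", "w", newline="") as out:
    writer = csv.DictWriter(out, fieldnames=fieldnames)
    writer.writeheader()
    for feat in features:
        writer.writerow(feat["properties"])
```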

My spreadsheets are not copyrighted, and I'm happy to share them with anyone who wants them. As with any tool, there is a literal ton of data out there and many ways to report it.

I've been out for a couple of hours, but when I got home I searched through the map path JSON that I don't normally use, to see if I could find the mid-drive attitude data. No luck in that JSON so far, as all it reports are the waypoints throughout each drive; some drives have dozens of them, tracing out the path of that particular drive. Sadly, I have found no attitude data in that JSON so far (still looking, before I ask friends for help).

But I did find some special data that I believed was not shared with the public. I've been looking for it since the mission started, and typically I find this kind of thing while looking for something else: the 'Spacecraft Clock Count' (SCLK) for the start and end of each drive. It was like finding the proverbial pot of gold at the end of the rainbow! The SCLK counts are reported for each drive in seconds, so a simple formula gives me the exact duration of each drive. The duration of the drive on sol 1503, from site 73.954 to site 73.1522, was exactly 3549 seconds, or 59.15 minutes. All I need to do now is simplify how to import that into my spreadsheet and add a line entry for drive duration :)
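For anyone curious, the duration maths really is that simple; here's a tiny sketch with made-up SCLK counts chosen so the difference matches the sol 1503 drive:

```python
# SCLK counts are in seconds, so "end minus start" is the drive time directly.
# The two counts below are placeholders, not the real sol 1503 values.
def drive_duration_minutes(sclk_start: float, sclk_end: float) -> float:
    """Return the drive duration in minutes from two Spacecraft Clock Counts."""
    return (sclk_end - sclk_start) / 60.0

print(drive_duration_minutes(800_000_000, 800_003_549))   # -> 59.15
```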

I can export each spreadsheet in MS Excel format or OpenDocument format (.ods), etc. Let me know if you want copies now or in the future :) If you decide to use them, I can provide as much guidance as you need to get going; once you see how easy they are to use, you'll no longer need any guidance :)

[–] [email protected] 2 points 4 days ago (1 children)

Thanks so much for the detailed reply, Paul! I'm looking forward to playing with this data. I'll let you know as soon as I'm ready.

[–] [email protected] 2 points 4 days ago

> I’ll let you know as soon as I’m ready.

No problem! A very smart and kind user on Mastodon has just provided me with a link to the M20 traverse JSON, so I can now do away with several steps and get the start/stop times for each drive without having to extract them from the map's GeoJSON. Importing the start/end times is still manual, but it only takes a short time. From that data I can calculate drive time and speed (it's not quite a speedster :) ), which may be of interest to some, and I can also calculate the longest/shortest drive times, average drive times, etc. Not sure yet what to use.
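If anyone wants to script it rather than use a spreadsheet, something along these lines would do the same sums. The URL and the property names ("sclk_start", "sclk_end", "dist_m") are only my guesses, not the real keys in the traverse JSON, so swap in whatever the real file uses:

```python
import json
from urllib.request import urlopen

TRAVERSE_URL = "https://example.com/M20_traverse.json"   # placeholder URL

with urlopen(TRAVERSE_URL) as resp:
    features = json.load(resp)["features"]

durations, speeds = [], []            # seconds per drive, metres/second per drive
for feat in features:
    props = feat["properties"]
    seconds = props["sclk_end"] - props["sclk_start"]   # assumed field names
    durations.append(seconds)
    if props.get("dist_m"):                             # assumed drive distance, metres
        speeds.append(props["dist_m"] / seconds)

print(f"longest drive : {max(durations) / 60:.2f} min")
print(f"shortest drive: {min(durations) / 60:.2f} min")
print(f"average drive : {sum(durations) / len(durations) / 60:.2f} min")
if speeds:
    print(f"mean speed    : {sum(speeds) / len(speeds) * 100:.2f} cm/s")
```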

I'm hoping the Calc team eventually creates an 'Import JSON' function like the one Excel has had for many years. Once they do, I can create a spreadsheet that automatically imports the data from several different JSONs after each drive and populates the tables I share whenever the spreadsheet is opened. Until then, there will be some manual copy, convert, and paste to do, but it literally takes an extra minute or two after the JSONs are published, once you're used to the routine.

Importing data manually is actually good practice, as you can see where the data comes from, and it's easier to fix in the event that JPL ever changes the format of the JSONs. They did change the M20 JSON a couple of times (they added extra data fields), but the last change was a couple of years ago, and the JSONs have been stable for quite some time. I didn't even notice the last change: with the 'Import JSON' function I was using in my spreadsheet, I had specified which header fields I wanted to import, and that option basically ignored the rest, importing only the fields I needed at the time. So I never noticed the new fields, nor did their addition break my spreadsheet (being able to specify which fields you need, rather than importing all of them, is a nice bonus).
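Scripted locally, that same "only import the fields you need" trick might look roughly like this (the column names are just examples, not the real schema):

```python
import csv
import json

WANTED = ["sol", "site", "drive", "dist_m"]       # assumed column names

with open("waypoints.json") as f:                 # placeholder file name
    features = json.load(f)["features"]

with open("waypoints_subset.csv", "w", newline="") as out:
    writer = csv.DictWriter(out, fieldnames=WANTED, extrasaction="ignore")
    writer.writeheader()
    for feat in features:
        writer.writerow(feat["properties"])       # any unlisted keys are dropped
```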

When you're ready for the spreadsheet, let me know the browser you're using and the spreadsheet package you intend to use, so I can tailor a set of instructions for gathering the data (JSON URLs etc.), then copying, converting, and dropping it into the spreadsheet. Different browsers can display a JSON URL with subtle variations; the LibreWolf browser I'm using now is a little easier to use with JSONs :)