git - R and version control for the solo data analyst

Many data analysts that I respect use version control. For example:

However, I'm evaluating whether adopting a version control system such as git would be worthwhile.

A brief overview: I'm a social scientist who uses R to analyse data for research publications. I don't currently produce R packages. My R code for a project typically includes a few thousand lines of code for data input, cleaning, manipulation, analyses, and output generation. Publications are typically written using LaTeX.

With regard to version control, there are many benefits which I have read about, yet they seem to be less relevant to the solo data analyst.

  • Backup: I have a backup system already in place.
  • Forking and rewinding: I've never felt the need to do this, but I can see how it could be useful (e.g., you are preparing multiple journal articles based on the same dataset; you are preparing a report that is updated monthly, etc)
  • Collaboration: Most of the time I am analysing data myself, thus, I wouldn't get the collaboration benefits of version control.

There are also several potential costs involved with adopting version control:

  • Time to evaluate and learn a version control system
  • A possible increase in complexity over my current file management system

However, I still have the feeling that I'm missing something. General guides on version control seem to be addressed more towards computer scientists than data analysts.

Thus, specifically in relation to data analysts in circumstances similar to those listed above:

  1. Is version control worth the effort?
  2. What are the main pros and cons of adopting version control?
  3. What is a good strategy for getting started with version control for data analysis with R (e.g., examples, workflow ideas, software, links to guides)?

1 Answer


I feel the answer to your question is a resounding yes- the benefits of managing your files with a version control system far outweigh the costs of implementing such a system.

I will try to respond in detail to some of the points you raised:

  • Backup: I have a backup system already in place.

Yes, and so do I. However, there are some questions to consider regarding the appropriateness of relying on a general purpose backup system to adequately track important and active files relating to your work. On the performance side:

  • At what interval does your backup system take snapshots?
  • How long does it take to build a snapshot?
  • Does it have to image your entire hard drive when taking a snapshot, or can it easily be told to back up just the two files that received critical updates?
  • Can your backup system show you, with pinpoint accuracy, what changed in your text files from one backup to the next?

And most importantly:

  • How many locations are the backups saved in? Are they in the same physical location as your computer?
  • How easy is it to restore a given version of a single file from your backup system?

For example, I have a Mac and use Time Machine to back up to another hard drive in my computer. Time Machine is great for recovering the odd file or restoring my system if things get messed up. However, it simply doesn't have what it takes to be trusted with my important work:

  • When backing up, Time Machine has to image the whole hard drive, which takes a considerable amount of time. If I continue working, there is no guarantee that my file will be captured in the state it was in when I initiated the backup. I may also reach another point I would like to save before the first backup finishes.

  • The hard drive to which my Time Machine backups are saved is located in my machine- this makes my data vulnerable to theft, fire and other disasters.

With a version control system like Git, I can initiate a backup of specific files with no more effort than requesting a save in a text editor- and the file is imaged and stored instantaneously. Furthermore, Git is distributed, so each computer that I work at has a full copy of the repository.

This amounts to having my work mirrored across four different computers- nothing short of an act of God could destroy my files and data, at which point I probably wouldn't care too much anyway.
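To make this concrete, here is a rough sketch of what that kind of targeted backup looks like at the command line (the file names and the remote called origin are placeholders for whatever your project actually uses):

    # stage only the two files that just received critical updates
    git add analysis.R figures.R
    git commit -m "fix outlier handling in regression models"

    # mirror the repository to a remote (another machine or a hosted server)
    git push origin master

    # on a second computer, pull the same work down
    git pull origin master

Each commit is a snapshot you can return to later, and pushing it copies the entire history to the remote, not just the latest state.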

  • Forking and rewinding: I've never felt the need to do this, but I can see how it could be useful (e.g., you are preparing multiple journal articles based on the same dataset; you are preparing a report that is updated monthly, etc)

As a soloist, I don't fork that much either. However, the time I have saved by having the option to rewind has single-handedly paid back my investment in learning a version control system many, many times. You say you have never felt the need to do this- but has rewinding any file under your current backup system really been a painless, feasible option?

Sometimes the report just looked better 45 minutes, an hour or two days ago.
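As an illustration of what rewinding actually costs (the file name report.Rnw and the commit hash below are made up for the example):

    # list the commits that touched the report
    git log --oneline -- report.Rnw

    # see exactly what has changed since an earlier commit
    git diff a1b2c3d -- report.Rnw

    # restore that earlier version of just this one file
    git checkout a1b2c3d -- report.Rnw

The rest of the project is untouched- only the one file is rewound, and because the newer version is still in the history, even that step is reversible.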

  • Collaboration: Most of the time I am analysing data myself, thus, I wouldn't get the collaboration benefits of version control.

Yes, but you would learn a tool that may prove to be indispensable if you do end up collaborating with others on a project.

  • Time to evaluate and learn a version control system

Don't worry too much about this. Version control systems are like programming languages- they have a few key concepts that need to be learned and the rest is just syntactic sugar. Basically, the first version control system you learn will require investing the most time- switching to another one just requires learning how the new system expresses key concepts.

Pick a popular system and go for it!

  • A possible increase in complexity over my current file management system

Do you have one folder, say Projects, that contains all the folders and files related to your data analysis activities? If so, then slapping version control on it is going to increase the complexity of your file system by exactly 0. If your projects are strewn about your computer, then you should centralize them before applying version control- this will end up decreasing the complexity of managing your files; that's why we have a Documents folder, after all.
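If it helps, this is roughly all it takes to put an existing project folder under version control (the directory name is only an example):

    cd ~/Projects/wage-mobility-paper   # any existing project folder
    git init
    git add .
    git commit -m "initial import of existing analysis files"

The folder itself doesn't move or change shape; Git just adds a hidden .git directory inside it.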

  1. Is version control worth the effort?

Yes! It gives you a huge undo button and allows you to easily transfer work from machine to machine without worrying about things like losing your USB drive.

  2. What are the main pros and cons of adopting version control?

The only con I can think of is a slight increase in file size- but modern version control systems can do absolutely amazing things with compression and selective saving so this is pretty much a moot point.

  3. What is a good strategy for getting started with version control for data analysis with R (e.g., examples, workflow ideas, software, links to guides)?

Keep the files that generate your data or reports under version control, and be selective. If you are using something like Sweave, store your .Rnw files and not the .tex files that get produced from them. Store raw data if it would be a pain to re-acquire. If possible, write and store a script that acquires your data and another that cleans or modifies it, rather than storing changes to raw data.
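One way to express that selectivity, assuming a layout with separate acquisition, cleaning and analysis scripts plus a Sweave document (all of the file names below are hypothetical):

    # ignore generated output; it can always be rebuilt from the sources
    cat > .gitignore <<'EOF'
    *.tex
    *.pdf
    *.aux
    *.log
    .Rhistory
    EOF

    # track the things that actually produce the results
    git add .gitignore acquire_data.R clean_data.R analysis.R paper.Rnw
    git commit -m "track sources and data scripts, ignore generated output"

Adjust the ignore patterns to taste- the point is simply that derived files stay out of the history while the scripts and .Rnw sources go in.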

As for learning a version control system, I highly recommend Git and this guide to it.

These websites also have some nice tips and tricks related to performing specific actions with Git:

