multi-user workflow #330
rattevijay started this conversation in General
I had a couple of questions about how Magpie works that I could not find answered on the GitHub site.
a. When multiple users run Magpie for ML/AI jobs, how does Magpie ensure that no data corruption occurs when two users process the same dataset?
b. In the same scenario as (a), if the users need to access data in a database instead of a file system, how do we ensure no data corruption?
I would appreciate some insight into how you would recommend setting up multi-user workflows with Magpie.

Replies: 1 comment

I'm assuming your questions relate to data stored in a parallel file system (e.g. Lustre), not an HDFS setup done via Magpie. Magpie generally has not assumed that multiple users will write to the same dataset; it is up to the users to ensure the data is not corrupted. That could be done via file locking, for example. I know of some users who simply made the input dataset read-only via Unix permissions so that no user is allowed to write to it. As for question ...
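Since Magpie itself does not coordinate concurrent access, any such coordination lives in the users' own job scripts. Below is a minimal Python sketch of the two approaches mentioned above, assuming the dataset sits on a shared POSIX file system. The paths and helper names (dataset_lock, freeze_dataset) are hypothetical, and advisory flock locking only protects against jobs that also take the lock; on Lustre it typically also requires the file system to be mounted with flock support.

```python
import fcntl
import os
import stat
from contextlib import contextmanager

# Hypothetical paths -- substitute the real shared dataset location.
DATASET = "/lustre/project/shared/input.parquet"
LOCKFILE = DATASET + ".lock"          # sidecar lock file all jobs agree on


@contextmanager
def dataset_lock(lock_path, exclusive=False):
    """Advisory flock around dataset access.

    Readers take a shared lock (LOCK_SH); a job that needs to modify the
    dataset takes an exclusive lock (LOCK_EX). This is cooperative, not
    enforced: it only protects against jobs that also take the lock.
    """
    with open(lock_path, "w") as fh:
        fcntl.flock(fh, fcntl.LOCK_EX if exclusive else fcntl.LOCK_SH)
        try:
            yield
        finally:
            fcntl.flock(fh, fcntl.LOCK_UN)


def freeze_dataset(path):
    """Alternative approach from the reply: make the staged dataset
    read-only for everyone (mode 0444) so no job can write to it."""
    os.chmod(path, stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH)


if __name__ == "__main__":
    # Many users can read concurrently under shared locks; a job that
    # rewrites the dataset should pass exclusive=True instead.
    with dataset_lock(LOCKFILE, exclusive=False):
        with open(DATASET, "rb") as f:
            _ = f.read(1024)   # stand-in for the actual ML/AI job's reads
```

For the read-only approach, whoever owns the data would call freeze_dataset() once after staging it; after that no runtime coordination is needed, since no job can write to the dataset at all.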