1. I would like to know whether the WorkerID assigned in the results CSV file is unique to each individual worker. What we're trying to do is track workers who have been completing a lot of our tasks. My understanding is that WorkerIDs are unique to every individual worker, but I'm not sure.

2. For other requesters who have submitted their own HITs: what metrics or criteria do you use for approving and rejecting workers' submissions, especially for tasks that involve classifying images? Is there a way for us to quality-check results without manually going through each worker's submission one by one? If there isn't, that would defeat the purpose of having external groundtruthing. There must be some way to automate it, whether programmatically or by other means, especially since we plan to submit a batch of over 5,000 tasks with 3 workers per task. It would be unreasonable for us to review 15,000 submissions by hand.

3. Is there an option to require workers to complete all the tasks before their submissions are reviewed? I doubt it, but I would like to make sure. My team was concerned about this because, when we checked our results, we noticed that many workers would come in, work on one or two tasks, and leave. Is there a way for us to require workers to complete all the tasks, or something along those lines? My idea is to create a custom qualification: if I see that a worker has completed a substantial number of our tasks, they become eligible for our future batches. Would increasing the general qualification criteria help too?
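For question 2, one common way to automate review when you have 3 workers per task is majority voting: approve the assignments that agree with the majority label for a HIT, and set aside HITs with no majority for manual review. Below is a minimal sketch. It assumes the results CSV has columns named `HITId`, `AssignmentId`, and a single answer column here called `Answer.label` (your actual answer column name will depend on your HIT layout). The approved/rejected IDs could then be fed to the MTurk API's `approve_assignment` / `reject_assignment` calls (e.g. via boto3), which this sketch deliberately leaves out.

```python
from collections import Counter, defaultdict

def majority_vote_review(rows):
    """Group assignments by HITId and take the majority answer.

    rows: iterable of dicts with keys 'HITId', 'AssignmentId',
          and 'Answer.label' (assumed column names; adapt to your CSV).

    Returns (approve_ids, reject_ids, manual_ids):
      - approve_ids: assignments agreeing with a 2-of-3 (or better) majority
      - reject_ids:  assignments disagreeing with that majority
      - manual_ids:  assignments in HITs with no majority (e.g. a 1-1-1 tie),
                     left for a human to look at
    """
    by_hit = defaultdict(list)
    for row in rows:
        by_hit[row['HITId']].append(row)

    approve, reject, manual = [], [], []
    for assignments in by_hit.values():
        counts = Counter(a['Answer.label'] for a in assignments)
        label, n = counts.most_common(1)[0]
        if n < 2:
            # No two workers agree: don't auto-reject anyone.
            manual.extend(a['AssignmentId'] for a in assignments)
            continue
        for a in assignments:
            target = approve if a['Answer.label'] == label else reject
            target.append(a['AssignmentId'])
    return approve, reject, manual
```

This keeps honest-but-outvoted workers from being punished too harshly only in the tie case; some requesters instead approve everyone and use the majority label purely as the dataset answer, rejecting only workers whose overall agreement rate is very low.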