
Add Parallelism getter property to Accelerator class #3703

Merged
S1ro1 merged 4 commits into huggingface:main from WoosungMyung:add-rank-getter-to-accelerator on Aug 2, 2025

Conversation

@WoosungMyung
Contributor

WoosungMyung commented Aug 2, 2025

This PR adds a @property method to easily get the parallelism rank in HF Accelerate.

Motivation:

  • To make it easier to retrieve the process rank in distributed training with Accelerate.

Changes:

  • Added a rank property in accelerator.py

This PR follows from issue #3702, discussed with @S1ro1.
Let me know if there’s anything I should improve.
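
For illustration, a minimal sketch of how a rank getter like this could be used in a training script. The property name `rank` below is an assumption based on this description, not necessarily the final API; `process_index` and `local_process_index` are existing Accelerator attributes, shown for comparison.

```python
from accelerate import Accelerator

accelerator = Accelerator()

# Existing attributes already expose the global and node-local process index.
print(accelerator.process_index, accelerator.local_process_index)

# With this PR, a parallelism-aware getter (name assumed here) could gate
# rank-specific work, e.g. only write logs or checkpoints from rank 0:
# if accelerator.rank == 0:
#     print("logging from rank 0 only")
```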

Thanks a lot for giving me the opportunity to contribute to this amazing project!

Inline review comment on the diff (excerpt from the new property's docstring and guard):

    Returns the local rank for shard-based data parallelism (e.g., FSDP).
    Raises an error if not enabled.
    """
    if not self.parallelism_config or not self.parallelism_config.dp_shard_enabled:
@S1ro1
Contributor

I think we want to adapt this a little, sorry for the wrong assumption yesterday. UX-wise, I'd say the best behavior is: return the rank if that parallelism is enabled, return 0 if a parallelism config is set but parallelism {x} is not enabled (in that case all ranks are essentially 0), and raise a RuntimeError if neither of these two is met. Do you think that is reasonable?

@WoosungMyung
Contributor Author

@S1ro1
Thanks for your fast feedback! I agree: if a parallelism config is set but that parallelism is not enabled, the property should return rank 0 for all processes. I'd appreciate it if you could take a look and let me know your thoughts.
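
To make the agreed behavior concrete, here is a minimal, self-contained sketch. It is illustrative only: the toy classes and the property name `dp_shard_rank` are assumptions, not the merged Accelerate code.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ToyParallelismConfig:
    # Stand-in for Accelerate's parallelism config; only the flag needed here.
    dp_shard_enabled: bool = False


class ToyAccelerator:
    def __init__(self, parallelism_config: Optional[ToyParallelismConfig] = None, process_index: int = 0):
        self.parallelism_config = parallelism_config
        self.process_index = process_index  # stand-in for the distributed rank

    @property
    def dp_shard_rank(self) -> int:
        """Rank for shard-based data parallelism (e.g. FSDP), per the behavior discussed above."""
        if self.parallelism_config is None:
            # Neither condition is met: no parallelism config at all.
            raise RuntimeError("parallelism_config is not set; the shard rank is undefined.")
        if not self.parallelism_config.dp_shard_enabled:
            # A config is set but dp sharding is not enabled: every process is rank 0.
            return 0
        return self.process_index


# ToyAccelerator(ToyParallelismConfig(dp_shard_enabled=True), process_index=3).dp_shard_rank  -> 3
# ToyAccelerator(ToyParallelismConfig()).dp_shard_rank                                        -> 0
# ToyAccelerator().dp_shard_rank                                                              -> raises RuntimeError
```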

@S1ro1
Contributor

S1ro1 left a comment


Thank you, LGTM now. Congratulations on your first contribution! I just fixed the style checks for you and will merge.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

S1ro1 merged commit cb343c6 into huggingface:main on Aug 2, 2025
25 checks passed
@WoosungMyung
Contributor Author

@S1ro1
Thank you so much for the open-minded feedback throughout my first PR. I really appreciate the constructive discussions and your welcoming attitude.

@S1ro1
Contributor

S1ro1 commented Aug 2, 2025


We thank you for the contribution and look forward to more 🤗 Pleasure to be of help!

