
Fix: include root module in parameter calculation#116

Open
Yui-Arthur wants to merge 1 commit into ultralytics:main from Yui-Arthur:fix-single-layer-params

Conversation


@Yui-Arthur Yui-Arthur commented Feb 18, 2026

I have read the CLA Document and I sign the CLA

Problem

When a single layer (e.g., nn.Conv2d) is passed directly to profile(), the function reports 0 parameters. This happens because the root module's own parameter count was ignored during the DFS traversal.
(Fixes #115)

Changes

  • Initialized total_params from module.total_params.item() in dfs_count() (instead of 0) to ensure the root module's own parameters are included.
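To illustrate the idea (this is a simplified sketch, not thop's actual dfs_count() implementation, which reads a total_params buffer attached during hooking): the recursive count must start from the module's own directly registered parameters rather than from 0, otherwise a leaf module passed as the root contributes nothing.

```python
import torch.nn as nn


def dfs_count(module: nn.Module) -> int:
    """Recursively count parameters, including the root module's own.

    The fix: seed the total with the module's *direct* parameters
    (recurse=False) instead of 0, then add the children's totals.
    """
    total = sum(p.numel() for p in module.parameters(recurse=False))
    for child in module.children():
        total += dfs_count(child)
    return total


# A bare Conv2d has no children, so seeding with 0 would return 0 here.
layer = nn.Conv2d(3, 10, (3, 3))
print(dfs_count(layer))  # 10*3*3*3 weights + 10 biases = 280
```

With the old initialization (total = 0 plus children only), the bare layer above would report 0, which is exactly the bug in issue #115.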

Minimal reproducible example (Issue #115)

from thop import profile
import torch
import torch.nn as nn

class test_model(nn.Module):
    def __init__(self):
        super().__init__()
        self.layer = nn.Conv2d(3, 10, (3, 3))

    def forward(self, x):
        return self.layer(x)


layer = nn.Conv2d(3, 10, (3, 3))
model = test_model()

# profile() expects the model inputs as a tuple
macs, params = profile(layer, inputs=(torch.zeros(1, 3, 224, 224),))
print("Single Layer Profile Params: ", params)
# Single Layer Profile Params:  280.0  (was 0.0 before this fix)
macs, params = profile(model, inputs=(torch.zeros(1, 3, 224, 224),))
print("Model Profile Params: ", params)
# Model Profile Params:  280.0

Before the fix, profiling the single layer reported zero parameters; after the fix, it reports the correct count.
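As a cross-check (not part of the PR), the expected value can be computed directly with plain PyTorch, which confirms 280 parameters for this layer:

```python
import torch.nn as nn

# Same layer as in the reproducible example above
layer = nn.Conv2d(3, 10, (3, 3))
n_params = sum(p.numel() for p in layer.parameters())
print(n_params)  # 10*3*3*3 weights + 10 biases = 280
```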

🛠️ PR Summary

Made with ❤️ by Ultralytics Actions

🌟 Summary

Fixes parameter counting in thop/profile.py by including the root module’s own parameters in the total. 🧮✅

📊 Key Changes

- Updates dfs_count() to initialize total_params from module.total_params.item() instead of 0.
- Makes parameter aggregation consistent with operation counting (which already includes the current module). 🔧

🎯 Purpose & Impact

- Prevents undercounting parameters for models where the top-level/root module has parameters. 🐛➡️✅
- Improves accuracy and trustworthiness of profiling outputs (params + ops) for end users running THOP profiling. 📈

Previously, passing a single layer (e.g., nn.Conv2d) resulted in 0 params because the root module's parameters were not added.

github-actions bot commented Feb 18, 2026

All Contributors have signed the CLA. ✅
Posted by the CLA Assistant Lite bot.

@UltralyticsAssistant UltralyticsAssistant added the bug (Something isn't working) and fixed (Bug has been resolved) labels on Feb 18, 2026
@UltralyticsAssistant

👋 Hello @Yui-Arthur, thank you for submitting an ultralytics/thop 🚀 PR! To ensure a seamless integration of your work, please review the following checklist:

- ✅ Define a Purpose: Clearly explain the purpose of your fix or feature in your PR description, and link to any relevant issues. Ensure your commit messages are clear, concise, and adhere to the project's conventions.
- ✅ Synchronize with Source: Confirm your PR is synchronized with the ultralytics/thop main branch. If it's behind, update it by clicking the 'Update branch' button or by running git pull and git merge main locally.
- ✅ Ensure CI Checks Pass: Verify all Ultralytics Continuous Integration (CI) checks are passing. If any checks fail, please address the issues.
- ✅ Update Documentation: Update the relevant documentation for any new or modified features.
- ✅ Add Tests: If applicable, include or update tests to cover your changes, and confirm that all tests are passing.
- ✅ Sign the CLA: Please ensure you have signed our Contributor License Agreement if this is your first Ultralytics PR by writing "I have read the CLA Document and I sign the CLA" in a new message.
- ✅ Minimize Changes: Limit your changes to the minimum necessary for your bug fix or feature addition. "It is not daily increase but daily decrease, hack away the unessential. The closer to the source, the less wastage there is." — Bruce Lee

For more guidance, please refer to our Contributing Guide. This is an automated message, and an engineer will assist soon. 😊🚀

@UltralyticsAssistant UltralyticsAssistant left a comment

🔍 PR Review

Made with ❤️ by Ultralytics Actions

PR looks clean. The change correctly aligns parameter counting with ops counting by including the root module’s parameters. No issues found.


Yui-Arthur commented Feb 18, 2026

Thank you for your submission, we really appreciate it. Like many open-source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution. You can sign the CLA by posting a Pull Request comment in the same format as below.

I have read the CLA Document and I sign the CLA

You can retrigger this bot by commenting recheck in this Pull Request. Posted by the CLA Assistant Lite bot.

I have read the CLA Document and I sign the CLA

recheck

@UltralyticsAssistant UltralyticsAssistant dismissed their stale review February 18, 2026 09:10

Superseded by new review

@UltralyticsAssistant UltralyticsAssistant left a comment

🔍 PR Review 2

Made with ❤️ by Ultralytics Actions

Review complete. No issues found in the diff; the change correctly includes root module parameters without introducing problems.

@UltralyticsAssistant UltralyticsAssistant dismissed their stale review February 18, 2026 09:12

Superseded by new review

@UltralyticsAssistant UltralyticsAssistant left a comment

🔍 PR Review 3

Made with ❤️ by Ultralytics Actions

Clean change that correctly includes root module parameters in the DFS count. No issues found.

@Yui-Arthur

I have read the CLA Document and I sign the CLA


Labels

bug Something isn't working fixed Bug has been resolved


Development

Successfully merging this pull request may close these issues.

Passing a single layer (e.g., nn.Conv2d) resulted in 0 params

2 participants