
[GraphBolt][DeprecationWarning] Deprecation warning resolved. #7443

Closed · wants to merge 1 commit

Conversation

drivanov
Contributor

Description

This PR fixes the following DeprecationWarning:

  /usr/local/lib/python3.10/dist-packages/dgl/graphbolt/base.py:81: DeprecationWarning: torch.library.impl_abstract 
was renamed to torch.library.register_fake. Please use that instead; we will remove torch.library.impl_abstract in a 
future version of PyTorch.
    @torch.library.impl_abstract("graphbolt::expand_indptr")

Checklist

Please feel free to remove inapplicable items for your PR.

  • The PR title starts with [$CATEGORY] (such as [NN], [Model], [Doc], [Feature])
  • I've leveraged the tools to beautify the Python and C++ code.
  • The PR is complete and small; read the Google engineering practices (a CL corresponds to a PR) to learn more about small PRs. In DGL, we consider PRs with fewer than 200 lines of core code change to be small (examples, tests, and documentation may be exempted).
  • To the best of my knowledge, examples are either not affected by this change, or have been fixed to be compatible with this change


@dgl-bot
Collaborator

dgl-bot commented May 31, 2024

Not authorized to trigger CI. Please ask a core developer to trigger it by issuing the comment:

  • @dgl-bot

@dgl-bot
Collaborator

dgl-bot commented May 31, 2024

Commit ID: 2fff0a3

Build ID: 1

Status: ❌ CI test failed in Stage [Authentication].

Report path: link

Full logs path: link

Collaborator

@mfbalin mfbalin left a comment


https://docs.google.com/document/d/1ZuCVyMfibExwvtzhd9cfMWk5zXT3Dhy1b3kuvAIkBoU/edit?usp=sharing

This guide says that PyTorch 2.3 and older needs impl_abstract, while PyTorch newer than 2.3 needs the new way to register the operator. We support PyTorch versions as old as 2.1, so we need to fix this in a way that keeps working with older PyTorch releases.

After the change, do the unit tests pass with PyTorch 2.3 and the PyTorch nightlies?

@mfbalin
Collaborator

mfbalin commented May 31, 2024

@dgl-bot

@dgl-bot
Collaborator

dgl-bot commented May 31, 2024

Commit ID: 2fff0a3

Build ID: 2

Status: ❌ CI test failed in Stage [Torch CPU (Win64) Unit test].

Report path: link

Full logs path: link

@mfbalin
Collaborator

mfbalin commented May 31, 2024

____ ERROR collecting tests/python/pytorch/dataloading/test_dataloader.py _____
tests\python\pytorch\dataloading\test_dataloader.py:6: in <module>
    import backend as F
tests\backend\__init__.py:8: in <module>
    from dgl.nn import *
python\dgl\__init__.py:16: in <module>
    from . import (
python\dgl\dataloading\__init__.py:13: in <module>
    from .dataloader import *
python\dgl\dataloading\dataloader.py:27: in <module>
    from ..distributed import DistGraph
python\dgl\distributed\__init__.py:5: in <module>
    from .dist_graph import DistGraph, DistGraphServer, edge_split, node_split
python\dgl\distributed\dist_graph.py:11: in <module>
    from .. import backend as F, graphbolt as gb, heterograph_index
python\dgl\graphbolt\__init__.py:39: in <module>
    from .base import *
python\dgl\graphbolt\base.py:81: in <module>
    @torch.library.register_fake("graphbolt::expand_indptr")
E   AttributeError: module 'torch.library' has no attribute 'register_fake'

@mfbalin
Collaborator

mfbalin commented Jun 29, 2024

#7493 has the fix that continues to work with older PyTorch.

@mfbalin mfbalin closed this Jun 29, 2024
3 participants