[ENH] Implementing the iTransformer model in PTFv2.
#1994
base: main
Conversation
PranavBhatP left a comment
Thanks for the PR @JATAYU000. I've dropped some comments on the PR.
Conventionally for v2, all the layers of a model's architecture live in the layers directory. Many of the layers you are using here can be imported directly from there; I see a lot of commonality. I would suggest adding only genuinely new layers (if not already present in layers) as a subdirectory, layers/_<layer-type>. sub_modules.py is a v1 convention.
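To illustrate the convention being described, here is a rough sketch of the intended layout. The directory and subpackage names are assumptions for illustration, not the repository's exact contents:

```
pytorch_forecasting/
    layers/
        _attention/      # reusable attention blocks shared across v2 models
        _embeddings/     # shared embedding layers
        _output/         # shared projection / output heads
    models/
        itransformer/    # model imports from layers/ rather than a local sub_modules.py
```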
Some of the layers here have changes compared to the shared ones, so I've left them in sub_modules.py for now in the draft PR; will fix this.
```python
            :, :, :N
        ]  # filter covariates

        if self.use_norm:
```
self.use_norm is not required in the model code since it will be handled by the D1/D2 layers. Normalization and denormalization need not be handled here. Simply return dec_out.
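To make the suggestion concrete, here is a minimal before/after sketch. These are pure-Python stand-ins: the function names and the doubling "decoder" are illustrative, not the PR's actual code.

```python
def forward_with_norm(x, use_norm=True):
    # v1-style: the model normalizes its input and denormalizes its output itself
    if use_norm:
        mean = sum(x) / len(x)
        x = [v - mean for v in x]
    dec_out = [v * 2 for v in x]  # stand-in for the decoder computation
    if use_norm:
        dec_out = [v + mean for v in dec_out]
    return dec_out


def forward_v2(x):
    # v2-style: normalization/denormalization lives in the D1/D2 layers,
    # so the model's forward simply returns the decoder output
    dec_out = [v * 2 for v in x]
    return dec_out
```

The second version is what the review asks for: the model stays agnostic to scaling, and the surrounding D1/D2 layers own the (de)normalization step.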
Oh, thank you for pointing that out.
```python
        }

    @classmethod
    def get_test_train_params(cls):
```
Can you add a few more test cases here?
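For illustration, a hedged sketch of what a broader get_test_train_params could return. The hyperparameter keys are assumptions based on common iTransformer settings, not the PR's actual signature:

```python
class ITransformerTestStub:
    """Stand-in class; only the classmethod below matters for this sketch."""

    @classmethod
    def get_test_train_params(cls):
        """Return several hyperparameter combinations to widen test coverage."""
        return [
            {},  # default configuration
            {"hidden_size": 32, "n_heads": 2},           # small model
            {"hidden_size": 64, "n_heads": 4, "dropout": 0.2},
            {"e_layers": 1, "activation": "gelu"},       # shallow encoder
        ]
```

Each dict is one parameterized test case, so adding entries here directly widens the matrix exercised by the shared test suite.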
```python
    """
    An implementation of iTransformer model for v2 of pytorch-forecasting.

    Parameters
```
Docstring for model hyperparameters is missing.
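For reference, a hedged numpydoc-style sketch of the kind of docstring being asked for. The parameter names and defaults are assumptions drawn from the upstream iTransformer implementation, not the PR's final interface:

```python
class ITransformerDocSketch:
    """
    An implementation of the iTransformer model for v2 of pytorch-forecasting.

    Parameters
    ----------
    hidden_size : int, default=128
        Dimension of the model's latent representation (d_model).
    n_heads : int, default=8
        Number of attention heads in each encoder layer.
    e_layers : int, default=2
        Number of stacked encoder layers.
    dropout : float, default=0.1
        Dropout rate applied inside attention and feed-forward blocks.
    """
```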
@PranavBhatP The

re layers, I would do as follows:

@fkiraly seems like a nice good first issue?
Reference Issues/PRs
Fixes #1899
What does this implement/fix? Explain your changes.
Have started interfacing iTransformer in PTFv2, from the TSlib repository thuml/iTransformer. Work in progress; would like to have suggestions on it.
What should a reviewer concentrate their feedback on?
Did you add any tests for the change?
Not yet
Any other comments?
PR checklist
pre-commit install. To run hooks independent of commit, execute
pre-commit run --all-files