Comments (8)
This behavior should be the result of our functionalization pass. @alanwaketan to confirm the expected behavior. Either way, let's have a DLPack documentation/tutorial that goes through example use cases and fully explains the correct behavior @ysiraichi.
from xla.
Thanks for the issue. I checked the buffer pointer at more places:

```python
>>> t0 = torch.arange(10, device=xm.xla_device())
>>> xm.mark_step(wait=True)
>>>
>>> capsule = xdlpack.to_dlpack(t0)
>>> t1 = xdlpack.from_dlpack(capsule)
>>> print(torch_xla._XLAC._unsafe_buffer_pointer(t0) == torch_xla._XLAC._unsafe_buffer_pointer(t1))
True
>>>
>>> t0[0] = 100
>>> xm.mark_step()
>>>
>>> print(torch_xla._XLAC._unsafe_buffer_pointer(t0) == torch_xla._XLAC._unsafe_buffer_pointer(t1))
True
>>> print(t0.eq(t1).all().item())
False
>>>
>>> print(torch_xla._XLAC._unsafe_buffer_pointer(t0) == torch_xla._XLAC._unsafe_buffer_pointer(t1))
False
```
Could you elaborate on "That's because even though functionalization emulates views and mutation, PyTorch/XLA doesn't really have the concept of views and can't mutate a given tensor."? Do you mean that when we do `t0[0] = 100`, the underlying PjRt buffer is not mutated, hence `t1` is not updated, even though `t0` and `t1` share the same storage? Let me also look into what torch_xla does when we do `t0[0] = 100`.
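For comparison, this is the aliasing behavior eager PyTorch gives through DLPack on CPU, where `from_dlpack` really does share memory, so an in-place write is visible through the alias (a minimal sketch using the stock `torch.utils.dlpack` API, not torch_xla):

```python
import torch
from torch.utils.dlpack import from_dlpack, to_dlpack

t0 = torch.arange(10)
t1 = from_dlpack(to_dlpack(t0))  # t1 aliases t0's storage via DLPack

assert t0.data_ptr() == t1.data_ptr()  # same underlying buffer

t0[0] = 100          # eager in-place write mutates the shared buffer...
assert t1[0] == 100  # ...so the alias observes the update
```

This is the semantics the DLPack exchange implies; the question above is why the lazy-tensor path diverges from it after an in-place op.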
Yes, exactly. In summary, a functionalized lazy tensor is composed of:

```
Tensor(
  impl=FunctionalTensorWrapper(
    value=Tensor(
      impl=XLATensorImpl(
        tensor=XLATensor(handle or tensor_data or ir_value)
      )
    )
  )
)
```
Suppose `t0` and `t1` share the same storage using the DLPack API. Whenever an in-place operation is called, e.g. `t0.add_(1)`, the functionalization layer actually calls the functional variant (`XLANativeFunctions::add`), which generates a new `XLATensor`. Later, that is wrapped by a new `FunctionalTensorWrapper` (let's call it `temp`). In the end, the functionalization layer replaces the `FunctionalTensorWrapper::value` of `t0` by the one inside `temp`. Thus, `t0` ends up with the updated value, while `t1` remains with the old one.
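That value-replacement can be sketched with a toy wrapper in plain Python (class names mirror the real ones, but this is an illustrative model, not PyTorch code), just to show why `t1` keeps the old value:

```python
# Toy model of functionalization's value replacement; hypothetical
# stand-ins for FunctionalTensorWrapper / XLATensor.

class XLATensor:
    def __init__(self, data):
        self.data = data

class FunctionalTensorWrapper:
    def __init__(self, value):
        self.value = value  # wraps an XLATensor

def functional_add(wrapper, n):
    # The functional variant produces a brand-new XLATensor...
    new_value = XLATensor(wrapper.value.data + n)
    # ...and the functionalization layer swaps it into *this* wrapper only.
    wrapper.value = new_value

shared = XLATensor(1)
t0 = FunctionalTensorWrapper(shared)
t1 = FunctionalTensorWrapper(shared)  # "shares storage" with t0

functional_add(t0, 1)       # emulates t0.add_(1)
assert t0.value.data == 2   # t0 now points at the new XLATensor
assert t1.value.data == 1   # t1 still points at the old one
```

The key point is that the swap happens on `t0`'s wrapper alone; nothing walks the other wrappers that alias the same underlying `XLATensor`.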
Try this: https://github.com/pytorch/xla/blob/master/torch_xla/csrc/aten_xla_type.cpp#L2703
Hmm. Not sure I get it. Could you explain a bit more?
That's a helper we can use to bridge information through the intermediate tensors created by functionalization for in-place ops.
When we do the in-place op `t0[0] = 100`, I see `XLANativeFunctions::_propagate_xla_data` invoked twice, by:

- `at::functionalization::fill__Scalar`
- `at::functionalization::copy_`

in sequence. So it seems the helper is already being used?
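For reference, the two-step rewrite that triggers those calls can be approximated out-of-place like this (a sketch of what the functionalization kernels compute, not the actual dispatcher code):

```python
import torch

t0 = torch.arange(10)

# t0[0] = 100 is roughly rewritten into functional steps:
filled = torch.full((1,), 100, dtype=t0.dtype)  # functional counterpart of fill_
t0_new = torch.cat([filled, t0[1:]])            # functional "copy back" into the base

assert t0_new[0] == 100
assert torch.equal(t0_new[1:], t0[1:])
```

Each step yields a fresh tensor, which is exactly where `_propagate_xla_data` gets a chance to carry XLA-side metadata from the old tensor to the new one.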