Comments (1)
Hi,
To highlight the difference between the two losses, as written in the paper:
Unlike our boundary loss, computing $D_S$ cannot be done in a single step before training. The distance needs to be re-computed at each epoch during training, for all the images. It also requires storing the whole volume $\Omega$ in memory, as we cannot compute the distance map for only a subset of $\Omega$. These might be important computational and memory limitations, more so when dealing with large images, as is the case for 3D distance maps.
Implementing their loss becomes much more complex: while our loss is simply a pixel-wise multiplication between the softmax and pre-computed distance map, [20] requires more work (which takes time and slows the training down).
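For context, the "pre-computed distance map" is a signed distance to the ground-truth boundary, which can be built once per image before training ever starts. A minimal sketch using scipy's Euclidean distance transform (`signed_dist_map` is an illustrative name; the repository's own helper, and its exact handling of boundary pixels, may differ):

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def signed_dist_map(mask: np.ndarray) -> np.ndarray:
    """Signed Euclidean distance to the boundary of a binary mask:
    negative inside the object, positive outside."""
    posmask = mask.astype(bool)
    if not posmask.any():          # empty mask: no boundary to measure
        return np.zeros(mask.shape, dtype=np.float32)
    negmask = ~posmask
    # distance_transform_edt(x) = distance of each nonzero voxel to the nearest zero voxel
    dist = (distance_transform_edt(negmask) * negmask
            - (distance_transform_edt(posmask) - 1) * posmask)
    return dist.astype(np.float32)
```

Because this depends only on the ground truth, it can live in the data-loading pipeline; the training loop then never touches a distance transform.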
from typing import List

import torch
from torch import Tensor, einsum

from utils import simplex, one_hot  # helper predicates from this repository

class BoundaryLoss():
    def __init__(self, **kwargs):
        # self.idc is used to filter out some classes of the target mask. Use fancy indexing
        self.idc: List[int] = kwargs["idc"]
        self.nd: str = kwargs["nd"]
        print(f"Initialized {self.__class__.__name__} with {kwargs}")

    def __call__(self, probs: Tensor, dist_maps: Tensor, _: Tensor, __, ___) -> Tensor:
        assert simplex(probs)
        assert not one_hot(dist_maps)

        pc = probs[:, self.idc, ...].type(torch.float32)
        dc = dist_maps[:, self.idc, ...].type(torch.float32)

        multipled = einsum(f"bk{self.nd},bk{self.nd}->bk{self.nd}", pc, dc)

        loss = multipled.mean()

        return loss
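Stripped of the repository's helpers, the loss itself is a single einsum over the batch. A self-contained toy call (shapes and values invented for illustration):

```python
import torch

# Toy batch: B=1, K=2 classes, 4x4 images, so nd == "wh" in the class above.
torch.manual_seed(0)
probs = torch.softmax(torch.randn(1, 2, 4, 4), dim=1)  # network softmax output
dist_maps = torch.randn(1, 2, 4, 4)                    # stand-in for precomputed signed distance maps
idc = [1]                                              # keep the foreground class only

pc = probs[:, idc, ...].float()
dc = dist_maps[:, idc, ...].float()

# The boundary loss is just the mean pixel-wise product: no distance
# transform is computed at training time.
loss = torch.einsum("bkwh,bkwh->bkwh", pc, dc).mean()
```

This is what makes the comparison above meaningful: the per-step cost is a multiply and a mean, independent of image size beyond the tensor ops themselves.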
from typing import cast

import numpy as np

from utils import probs2one_hot, one_hot2hd_dist  # helpers from this repository

class HausdorffLoss():
    """
    Implementation heavily inspired from https://github.com/JunMa11/SegWithDistMap
    """
    def __init__(self, **kwargs):
        # self.idc is used to filter out some classes of the target mask. Use fancy indexing
        self.idc: List[int] = kwargs["idc"]
        self.nd: str = kwargs["nd"]
        print(f"Initialized {self.__class__.__name__} with {kwargs}")

    def __call__(self, probs: Tensor, target: Tensor, _: Tensor, __, ___) -> Tensor:
        assert simplex(probs)
        assert simplex(target)
        assert probs.shape == target.shape

        B, K, *xyz = probs.shape  # type: ignore

        pc = cast(Tensor, probs[:, self.idc, ...].type(torch.float32))
        tc = cast(Tensor, target[:, self.idc, ...].type(torch.float32))
        assert pc.shape == tc.shape == (B, len(self.idc), *xyz)

        # Distance map of the ground truth, recomputed for every batch
        target_dm_npy: np.ndarray = np.stack([one_hot2hd_dist(tc[b].cpu().detach().numpy())
                                              for b in range(B)], axis=0)
        assert target_dm_npy.shape == tc.shape == pc.shape
        tdm: Tensor = torch.tensor(target_dm_npy, device=probs.device, dtype=torch.float32)

        # Distance map of the current prediction: forces a GPU -> CPU round trip each step
        pred_segmentation: Tensor = probs2one_hot(probs).cpu().detach()
        pred_dm_npy: np.ndarray = np.stack([one_hot2hd_dist(pred_segmentation[b, self.idc, ...].numpy())
                                            for b in range(B)], axis=0)
        assert pred_dm_npy.shape == tc.shape == pc.shape
        pdm: Tensor = torch.tensor(pred_dm_npy, device=probs.device, dtype=torch.float32)

        delta = (pc - tc)**2
        dtm = tdm**2 + pdm**2

        multipled = einsum(f"bk{self.nd},bk{self.nd}->bk{self.nd}", delta, dtm)

        loss = multipled.mean()

        return loss
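The helper `one_hot2hd_dist` is not shown in the thread. A plausible reconstruction (hypothetical, modeled on the SegWithDistMap repository linked in the docstring) computes, per class, the unsigned distance of every voxel to the object:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def one_hot2hd_dist(seg: np.ndarray) -> np.ndarray:
    """Per-class unsigned distance map of a one-hot (K, *xyz) segmentation.
    Hypothetical reconstruction: the repository's actual helper may differ."""
    res = np.zeros_like(seg, dtype=np.float64)
    for k in range(seg.shape[0]):
        posmask = seg[k].astype(bool)
        if posmask.any():
            # Distance of every voxel to the nearest object voxel of class k
            res[k] = distance_transform_edt(~posmask)
    return res
```

Note that in `HausdorffLoss` this runs twice per batch (ground truth and prediction), on the CPU, which is exactly the overhead the quoted paragraph is describing.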
I do not find the boundary loss in your code. The loss used for 3D image segmentation is also time-consuming.
Indeed, the latest version of the code has not been uploaded; I will do that very soon.
from boundary-loss.
Related Issues (20)
- Does einsum really make the code easier to understand HOT 2
- ISLES 2018 HOT 1
- Heterogeneous resolution yields non-zero boundary. HOT 5
- InvalidArgumentError: required broadcastable shapes at loc(unknown) [Op:Mul] HOT 2
- Can this loss be used for multi-label classification? HOT 4
- Create dist_map for image segmentation mask as label. HOT 2
- Is multiplication by negmask in one_hot2dist() irrelevant? HOT 2
- Question about the optional argument resolution in the dist_map_transform function HOT 1
- About the calculation of dist_map HOT 5
- how to use with sigmoid as activation function when meeting binary classification segmentation task HOT 3
- how to adjust the lambda parameter HOT 5
- How to use HausdorffLoss? HOT 1
- How to use HausdorffLoss? HOT 1
- How to one-hot encode a multi-class dataset and how to use Boundary Loss on B x N x W x H logits? HOT 2
- Only using boundary loss leads to non convergence HOT 1
- Failure of matching datasets of WMH HOT 1
- Is it possible to train the Boundary Loss code on a GPU? HOT 1
- Whether this loss function can be applied to the partition of a hollow region, that is, a region with two boundaries HOT 2
- License Request
- zero question