pystiche.loss.functional

pystiche.loss.functional.mrf_loss(input, target, eps=1e-08, reduction='mean', batched_input=None)

Calculates the MRF loss. See pystiche.loss.MRFLoss for details.

Parameters
  • input (Tensor) – Input of shape \(B \times S_1 \times N_1 \times \dots \times N_D\).

  • target (Tensor) – Target of shape \(B \times S_2 \times N_1 \times \dots \times N_D\).

  • eps (float) – Small value to avoid zero division. Defaults to 1e-8.

  • reduction (str) – Reduction method of the output passed to pystiche.misc.reduce(). Defaults to "mean".

  • batched_input (Optional[bool]) – If False, treat the first dimension of the inputs as sample dimension, i.e. \(S \times N_1 \times \dots \times N_D\). Defaults to True. See pystiche.cosine_similarity() for details.

Examples

>>> import torch
>>> import pystiche.loss.functional as F
>>> input = torch.rand(1, 256, 64, 3, 3)
>>> target = torch.rand(1, 128, 64, 3, 3)
>>> score = F.mrf_loss(input, target, batched_input=True)
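
When the patch tensors are not batched, batched_input=False treats the first dimension as the sample dimension. This is a minimal sketch; the shapes below are illustrative only:

>>> import torch
>>> import pystiche.loss.functional as F
>>> # unbatched patches of shape S x N_1 x ... x N_D
>>> input = torch.rand(256, 64, 3, 3)
>>> target = torch.rand(128, 64, 3, 3)
>>> score = F.mrf_loss(input, target, batched_input=False)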
Return type

Tensor

pystiche.loss.functional.total_variation_loss(input, exponent=2.0, reduction='mean')

Calculates the total variation loss. See pystiche.loss.TotalVariationLoss for details.

Parameters
  • input (Tensor) – Input image

  • exponent (float) – Parameter \(\beta\). A higher value leads to smoother results. Defaults to 2.0.

  • reduction (str) – Reduction method of the output passed to pystiche.misc.reduce(). Defaults to "mean".

Examples

>>> import torch
>>> import pystiche.loss.functional as F
>>> input = torch.rand(2, 3, 256, 256)
>>> score = F.total_variation_loss(input)
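
The exponent and reduction can also be set explicitly; the values below are illustrative, not recommendations:

>>> import torch
>>> import pystiche.loss.functional as F
>>> input = torch.rand(2, 3, 256, 256)
>>> score = F.total_variation_loss(input, exponent=1.0, reduction="sum")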
Return type

Tensor

pystiche.loss.functional.value_range_loss(input, min=0.0, max=1.0, reduction='mean')

Calculates the value range loss.

Parameters
  • input (Tensor) – Input image

  • min (float) – Lower bound of the value range. Defaults to 0.0.

  • max (float) – Upper bound of the value range. Defaults to 1.0.

  • reduction (str) – Reduction method of the output passed to pystiche.misc.reduce(). Defaults to "mean".
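
Examples

A minimal usage sketch based on the signature above; the tensor values are random and purely illustrative:

>>> import torch
>>> import pystiche.loss.functional as F
>>> input = torch.rand(2, 3, 256, 256)
>>> score = F.value_range_loss(input)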
Return type

Tensor