Member "pytorch-1.8.2/docs/source/nn.functional.rst" (23 Jul 2021, 8638 Bytes) of package pytorch-1.8.2.tar.gz

torch.nn.functional

Convolution functions

conv1d
conv2d
conv3d
conv_transpose1d
conv_transpose2d
conv_transpose3d
unfold
fold
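As a quick illustration of the convolution functions above, here is a minimal sketch applying `torch.nn.functional.conv2d` to a random batch (the tensor shapes are illustrative, not from the source):

```python
import torch
import torch.nn.functional as F

# A batch of 2 RGB images, 8x8 pixels: (N, C_in, H, W)
x = torch.randn(2, 3, 8, 8)
# 4 output channels, 3x3 kernels: (C_out, C_in, kH, kW)
weight = torch.randn(4, 3, 3, 3)

# padding=1 with a 3x3 kernel preserves the spatial size
out = F.conv2d(x, weight, padding=1)
print(out.shape)  # torch.Size([2, 4, 8, 8])
```

The functional form takes the weight tensor explicitly, unlike the `nn.Conv2d` module, which owns its parameters.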

Pooling functions

avg_pool1d
avg_pool2d
avg_pool3d
max_pool1d
max_pool2d
max_pool3d
max_unpool1d
max_unpool2d
max_unpool3d
lp_pool1d
lp_pool2d
adaptive_max_pool1d
adaptive_max_pool2d
adaptive_max_pool3d
adaptive_avg_pool1d
adaptive_avg_pool2d
adaptive_avg_pool3d
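A short sketch contrasting fixed-kernel pooling with adaptive pooling (shapes chosen for illustration): `max_pool2d` takes a kernel size, while the adaptive variants take a target output size and work for any input size.

```python
import torch
import torch.nn.functional as F

x = torch.randn(1, 16, 32, 32)

# 2x2 max pooling halves each spatial dimension
pooled = F.max_pool2d(x, kernel_size=2)
print(pooled.shape)  # torch.Size([1, 16, 16, 16])

# Adaptive pooling targets an output size regardless of input size;
# (1, 1) gives global average pooling
gap = F.adaptive_avg_pool2d(x, output_size=(1, 1))
print(gap.shape)  # torch.Size([1, 16, 1, 1])
```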

Non-linear activation functions

threshold
threshold_
relu
relu_
hardtanh
hardtanh_
hardswish
relu6
elu
elu_
selu
celu
leaky_relu
leaky_relu_
prelu
rrelu
rrelu_
glu
gelu
logsigmoid
hardshrink
tanhshrink
softsign
softplus
softmin
softmax
softshrink
gumbel_softmax
log_softmax
tanh
sigmoid
hardsigmoid
silu
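A minimal sketch of the activation functions, including the trailing-underscore in-place variants listed above (input values are illustrative):

```python
import torch
import torch.nn.functional as F

x = torch.tensor([-1.0, 0.0, 2.0])

# relu returns a new tensor; x is unchanged
print(F.relu(x))  # tensor([0., 0., 2.])

# softmax produces a probability distribution along the given dim
print(F.softmax(x, dim=0).sum())  # sums to 1

# In-place variants (trailing underscore) modify their input directly,
# saving memory at the cost of overwriting the original values
y = torch.tensor([-1.0, 0.0, 2.0])
F.relu_(y)
print(y)  # tensor([0., 0., 2.])
```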

Normalization functions

batch_norm
instance_norm
layer_norm
local_response_norm
normalize
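Two of the normalization functions in a short sketch (shapes illustrative): `normalize` rescales vectors to unit norm, while `layer_norm` standardizes over the trailing dimensions.

```python
import torch
import torch.nn.functional as F

x = torch.randn(4, 10)

# L2-normalize each row to unit norm
unit = F.normalize(x, p=2, dim=1)
print(unit.norm(dim=1))  # all ~1.0

# layer_norm standardizes (zero mean, unit variance) over the last dim
ln = F.layer_norm(x, normalized_shape=(10,))
print(ln.mean(dim=1))    # all ~0.0
```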

Linear functions

linear
bilinear
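A sketch of the functional `linear` call (dimensions illustrative). Note the weight layout: `(out_features, in_features)`, so the operation is `x @ weight.T + bias`.

```python
import torch
import torch.nn.functional as F

x = torch.randn(5, 8)       # batch of 5, 8 features each
weight = torch.randn(3, 8)  # (out_features, in_features)
bias = torch.randn(3)

out = F.linear(x, weight, bias)
print(out.shape)  # torch.Size([5, 3])
```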

Dropout functions

dropout
alpha_dropout
feature_alpha_dropout
dropout2d
dropout3d
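Unlike the `nn.Dropout` module, the functional form takes an explicit `training` flag, since a plain function has no train/eval mode. A minimal sketch (p and shapes illustrative):

```python
import torch
import torch.nn.functional as F

x = torch.ones(1000)

# In training mode, elements are zeroed with probability p and the
# survivors are scaled by 1/(1-p), keeping the expected value unchanged
dropped = F.dropout(x, p=0.5, training=True)

# With training=False, dropout is the identity
same = F.dropout(x, p=0.5, training=False)
print(torch.equal(same, x))  # True
```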

Sparse functions

embedding
embedding_bag
one_hot
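A sketch of the sparse functions (vocabulary size and dimensions illustrative): `embedding` is a differentiable row lookup, `one_hot` a dense 0/1 encoding.

```python
import torch
import torch.nn.functional as F

weight = torch.randn(10, 4)   # vocabulary of 10, embedding dim 4
ids = torch.tensor([1, 3, 3, 7])

# embedding gathers rows of the weight matrix by index
vecs = F.embedding(ids, weight)
print(vecs.shape)  # torch.Size([4, 4])

# one_hot encodes integer class indices as 0/1 vectors
print(F.one_hot(ids, num_classes=10).shape)  # torch.Size([4, 10])
```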

Distance functions

pairwise_distance
cosine_similarity
pdist
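A short sketch of the distance functions on matching batches of vectors (shapes illustrative):

```python
import torch
import torch.nn.functional as F

a = torch.randn(6, 32)
b = torch.randn(6, 32)

# Cosine similarity per row, values in [-1, 1]
sim = F.cosine_similarity(a, b, dim=1)
print(sim.shape)  # torch.Size([6])

# L2 distance between corresponding rows
dist = F.pairwise_distance(a, b)
print(dist.shape)  # torch.Size([6])
```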

Loss functions

binary_cross_entropy
binary_cross_entropy_with_logits
poisson_nll_loss
cosine_embedding_loss
cross_entropy
ctc_loss
hinge_embedding_loss
kl_div
l1_loss
mse_loss
margin_ranking_loss
multilabel_margin_loss
multilabel_soft_margin_loss
multi_margin_loss
nll_loss
smooth_l1_loss
soft_margin_loss
triplet_margin_loss
triplet_margin_with_distance_loss
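A sketch of the most common loss above (batch size and class count illustrative): `cross_entropy` combines `log_softmax` and `nll_loss` in one call, so it expects raw logits and integer class indices, not probabilities.

```python
import torch
import torch.nn.functional as F

logits = torch.randn(4, 3)           # 4 samples, 3 classes (raw scores)
target = torch.tensor([0, 2, 1, 2])  # integer class indices

# Scalar by default (reduction='mean'); pass reduction='none'
# to get one loss value per sample instead
loss = F.cross_entropy(logits, target)
print(loss.shape)  # torch.Size([])
```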

Vision functions

pixel_shuffle
pixel_unshuffle
pad
interpolate
upsample
upsample_nearest
upsample_bilinear
grid_sample
affine_grid
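A sketch of two vision functions (shapes and pad widths illustrative). `interpolate` is the current API; the `upsample*` entries above are its deprecated predecessors. Note that `pad` specifies widths starting from the last dimension: `(left, right, top, bottom)` for a 4-D input.

```python
import torch
import torch.nn.functional as F

x = torch.randn(1, 3, 16, 16)

# Upsample by a factor of 2 with bilinear interpolation
up = F.interpolate(x, scale_factor=2, mode="bilinear", align_corners=False)
print(up.shape)  # torch.Size([1, 3, 32, 32])

# (1, 1) pads the width, (2, 2) pads the height
padded = F.pad(x, (1, 1, 2, 2))
print(padded.shape)  # torch.Size([1, 3, 20, 18])
```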

DataParallel functions (multi-GPU, distributed)

data_parallel (torch.nn.parallel.data_parallel)