"Fossies" - the Fresh Open Source Software Archive
Member "petsc-3.12.3/docs/changes/2018-21.html" (3 Jan 2020, 10698 Bytes) of package /linux/misc/petsc-3.12.3.tar.gz:
Documentation: Changes: 2.0.18-2.0.21
NEW FEATURES and CHANGES in PETSc 2.0.18-2.0.21
Complex numbers performance upgrade: Added support for using
optimized Fortran kernels for some key complex-number numerical
routines (such as matrix-vector products, vector norms, etc.) instead
of the default C++ routines. This implementation exploits the
maturity of Fortran compilers while retaining an identical user
interface. For example, on rs6000 machines the Fortran kernels run
4-5 times faster than the default C++ code on a single node.
Changed the names of various compiler flags, e.g., changed
PETSC_COMPLEX to USE_PETSC_COMPLEX.
Renamed a routine to PetscObjectCompose(), since composition really denotes
a "has-a" relationship, not an "is-a" relationship.
AO (Application Orderings):
- Removed the MPI_Comm argument, since the communicator is contained
in the IS arguments.
- The AOxxxToxxx() remapping routines will not map negative entries
in the input arrays. This allows, for example, the mapping of
neighbor lists that use negative entries to indicate non-existent
neighbors due to boundaries, etc.
TS (Timestepping Solvers):
- Added an interface to PVODE, the stiff integrator package of Hindmarsh et al.
SNES (Nonlinear Solvers):
- Added support for using matrix colorings within finite difference
Jacobian approximations. See the section "Finite Difference
Jacobian Approximations" of the users manual and the associated
man pages for more details.
- Fixed a bug in method SNES_EQ_NLS.
- Increased the default maximum number of function evaluations to 100000.
SLES (Linear Solvers):
KSP (Krylov Subspace Methods):
- Added the routine
PC (Preconditioners):
- Added -pc_lu_fill and -pc_ilu_fill to replace -mat_lu_fill and
-mat_ilu_fill.
- Added support for matrix colorings, intended primarily for use in finite
difference Jacobian approximations. See the SNES section above for more info.
Mat (Matrices):
- Added the matrix option
MatSetOption(mat,MAT_NEW_NONZERO_ALLOCATION_ERROR), which causes an
error if a new entry that has not been preallocated is generated in
a sparse matrix (currently implemented for AIJ and BAIJ matrices
only). This is a useful flag when debugging matrix memory preallocation.
- Added the routine MatSetValuesBlockedLocal() for BAIJ matrices.
- Changed the final argument of MatGetTypeFromOptions() from type int* to PetscTruth*.
- Added MatCreateSeqAdj() for supplying adjacency matrices to PETSc
for reordering (for example, RCM to reduce bandwidth and thus improve
cache performance) and, eventually, for partitioners.
- MatSetLocalToGlobalMapping() and MatSetLocalToGlobalMappingBlocked()
now take an ISLocalToGlobalMapping object rather than a list of indices.
- Added the routine MatGetSubMatrix(), which extracts a parallel submatrix
from a parallel matrix (currently implemented only for the MPIAIJ format).
DA (Distributed Arrays):
When used with the DA_STENCIL_STAR stencil type, the routine
DAGetGlobalIndices() returns local-to-global mapping indices that now
include the inactive corner ghost nodes. This is useful, e.g., when
using MatSetValuesLocal() to set matrix elements, including those at
the corner ghost nodes.
Vec (Vectors):
VecSetLocalToGlobalMapping() now takes an ISLocalToGlobalMapping
object rather than a list of indices.
- Added the routine VecCreateMPIWithArray().
Changed the calling sequence for VecCreateGhost(); added
VecCreateGhostWithArray() and VecGhostUpdate[Begin/End]().
IS (Index Sets):
Added ISGlobalToLocalMappingApply() to allow one to convert lists
that are in the global numbering to a local numbering.
- Added a communicator as the first argument to ISLocalToGlobalMappingCreate().
Draw (Graphics):
- Added routines for drawing simple histograms. See DrawHistCreate().
- Removed the option -draw_x_private_colormap and made a private colormap the default.
Added the option -draw_x_shared_colormap to indicate not to use
a private colormap. If you use Netscape on your machine and are also
doing contour plots, you generally don't want to use a shared colormap.
- Improved the colors used in the contour plotting.
Changed some routine names:
- DrawText() to DrawString()
- DrawTextVertical() to DrawStringVertical()
- DrawTextSetSize() to DrawStringSetSize()
- DrawTextGetSize() to DrawStringGetSize()
- DrawSyncClear() to DrawSynchronizedClear()
- DrawSyncFlush() to DrawSynchronizedFlush()
- DrawSyncGetMouseButton() to DrawSynchronizedGetMouseButton().
- Added VIEWER_STDOUT_() and VIEWER_STDERR_().
- Added the routine OptionsClearValue().
Added the option -get_resident_set_size that causes the program to
call PetscGetResidentSetSize() at the end of the run and print how
much physical memory each process has used.
Changed OptionsGetProgramName() to PetscGetProgramName() and changed
the calling sequence to match PetscGetHostname(), etc.
- Changed BINARY_INT and BINARY_SCALAR to PETSC_INT and PETSC_SCALAR.
Added routines to map between C and Fortran representations of communicators:
extern int MPICCommToFortranComm(MPI_Comm,int *);
extern int MPIFortranCommToCComm(int,MPI_Comm*);
These provide the same functionality that
extern int PetscCObjectToFortranObject(void *,int *);
extern int PetscFortranObjectToCObject(int,void *);
do for PETSc objects.
Removed the macros Double, DBLE, PetscDoubleExp as they are no longer
required. PETSc now compiles on the Cray T3D/T3E with the -dp option
that correctly handles Fortran code using double precision.
Added support for MatGetRow() and MatRestoreRow() from Fortran; see
the man pages for the Fortran calling sequences.
Added PetscBinaryOpen(), PetscBinaryClose(), PetscBinaryRead(), and
PetscBinaryWrite() for binary I/O from Fortran; see
src/vec/examples/tests/ex20.F. Most users should not need this
functionality.