Commit f79d3e3

Merge pull request #6 from mrava87/master
Prepare for v0.1.0
2 parents 501dd04 + 800a4e9

3 files changed: +84 -0 lines changed

CHANGELOG.md

Lines changed: 14 additions & 0 deletions
@@ -1,2 +1,16 @@
+# 0.1.0
+
+* Added ``pylops_distributed.Restriction`` operator
+* Added ``pylops_distributed.signalprocessing.Convolve1D``
+  and ``pylops_distributed.signalprocessing.FFT2D`` operators
+* Improved efficiency of
+  ``pylops_distributed.signalprocessing.Fredholm1`` when
+  ``saveGt=False``
+* Adapted ``pylops_distributed.optimization.cg.cg`` and
+  ``pylops_distributed.optimization.cg.cgls`` solvers for
+  complex numbers
+
+
+
 # 0.0.0
 * First official release.

docs/source/changelog.rst

Lines changed: 17 additions & 0 deletions
@@ -3,6 +3,23 @@
 Changelog
 =========
 
+
+Version 0.1.0
+-------------
+
+*Released on: 09/02/2020*
+
+* Added :py:class:`pylops_distributed.Restriction` operator
+* Added :py:class:`pylops_distributed.signalprocessing.Convolve1D`
+  and :py:class:`pylops_distributed.signalprocessing.FFT2D` operators
+* Improved efficiency of
+  :py:class:`pylops_distributed.signalprocessing.Fredholm1` when
+  ``saveGt=False``
+* Adapted :py:func:`pylops_distributed.optimization.cg.cg` and
+  :py:func:`pylops_distributed.optimization.cg.cgls` solvers for
+  complex numbers
+
+
 Version 0.0.0
 -------------

Lines changed: 53 additions & 0 deletions
@@ -0,0 +1,53 @@
+from pylops import Diagonal as pDiagonal
+from pylops_distributed import LinearOperator
+
+
+class Diagonal(LinearOperator):
+    r"""Diagonal operator.
+
+    Applies element-wise multiplication of the input vector with the vector
+    ``diag`` in forward and with its complex conjugate in adjoint mode.
+
+    This operator can also broadcast; in this case the input vector is
+    reshaped into its dimensions ``dims`` and the element-wise multiplication
+    with ``diag`` is performed on the direction ``dir``. Note that the
+    vector ``diag`` will need to have size equal to ``dims[dir]``.
+
+    Parameters
+    ----------
+    diag : :obj:`dask.array.ndarray`
+        Vector to be used for element-wise multiplication.
+    dims : :obj:`list`, optional
+        Number of samples for each dimension
+        (``None`` if only one dimension is available)
+    dir : :obj:`int`, optional
+        Direction along which multiplication is applied.
+    compute : :obj:`tuple`, optional
+        Compute the outcome of forward and adjoint or simply define the graph
+        and return a :obj:`dask.array.array`
+    todask : :obj:`tuple`, optional
+        Apply :func:`dask.array.from_array` to model and data before applying
+        forward and adjoint respectively
+    dtype : :obj:`str`, optional
+        Type of elements in input array.
+
+    Attributes
+    ----------
+    shape : :obj:`tuple`
+        Operator shape
+    explicit : :obj:`bool`
+        Operator contains a matrix that can be solved explicitly (``True``) or
+        not (``False``)
+
+    Notes
+    -----
+    Refer to :class:`pylops.basicoperators.Diagonal` for implementation
+    details.
+
+    """
+    def __init__(self, diag, dims=None, dir=0,
+                 compute=(False, False), todask=(False, False),
+                 dtype='float64'):
+        Op = pDiagonal(diag, dims=dims, dir=dir, dtype=dtype)
+        super().__init__(Op.shape, Op.dtype, Op, explicit=False,
+                         compute=compute, todask=todask)
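For context, a minimal usage sketch of the new operator follows. It is an illustration only and not part of the commit: it assumes the class is importable as pylops_distributed.Diagonal (the file path and package exports are not visible in this diff) and that, as in pylops, forward and adjoint are applied with * and .H.

# Hypothetical usage sketch; import path and operator overloads assumed as noted above
import numpy as np
import dask.array as da
from pylops_distributed import Diagonal  # assumed package-level export

d = da.from_array(np.array([1., 2., 3., 4., 5.]), chunks=5)
Dop = Diagonal(d, compute=(True, True))  # request computed outputs instead of lazy graphs

x = da.ones(5, chunks=5)
y = Dop * x        # forward: element-wise d * x
xadj = Dop.H * y   # adjoint: multiplication by the complex conjugate of d

With compute=(True, True), y and xadj would be returned as computed arrays rather than lazy dask graphs, matching the compute parameter described in the docstring above.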

0 commit comments
