Some minor refactoring of the general structure, and a new memory efficient active subspace algorithm #11

Merged · 51 commits · Oct 8, 2021
Changes from 1 commit
7f303d3
try / excepting krylov solution failure for nonlinear problems
tomoleary Jul 22, 2021
2003f48
checking in on tom_dev
tomoleary Sep 7, 2021
a8902dc
updating drivers and so on to be compatible with hessianlearn
tomoleary Sep 7, 2021
b89cf56
rearranging things to sort better by paper
tomoleary Sep 8, 2021
a2bad5a
updating markdown readmes
tomoleary Sep 8, 2021
508dcee
small readme edit
tomoleary Sep 8, 2021
f7bb338
more efficient saving procedure
tomoleary Oct 1, 2021
8d5ae04
not sure what changed here
tomoleary Oct 3, 2021
eb3f240
Merge branch 'tom_dev' of github.com:hippylib/hippyflow into tom_dev
tomoleary Oct 3, 2021
0d42c68
updating
tomoleary Oct 4, 2021
4e423b8
updating data saving procedure
tomoleary Oct 5, 2021
7b702ab
updating some work on improving active subspace memory footprint
tomoleary Oct 5, 2021
8a24b56
working on memory efficient active subspace slowly
tomoleary Oct 6, 2021
7697b8a
a lot more refactoring of the E[JTJ] operator and so on for memory is…
tomoleary Oct 6, 2021
13871ca
updating
tomoleary Oct 7, 2021
6f4065d
updating and working towards continuous integration
tomoleary Oct 8, 2021
acb25bd
typo
tomoleary Oct 8, 2021
af8354b
updating unit tests
tomoleary Oct 8, 2021
c391b7b
missing python instruction
tomoleary Oct 8, 2021
ea21985
updating travis yml
tomoleary Oct 8, 2021
59901ad
updating yaml for travis
tomoleary Oct 8, 2021
75e77e2
updating travis yaml again trying to fix build issues related to clon…
tomoleary Oct 8, 2021
fdc30e6
issue importing hippyflow in the test
tomoleary Oct 8, 2021
a7c3ffe
more testing stuff
tomoleary Oct 8, 2021
33acd12
more things being resolved
tomoleary Oct 8, 2021
dde0fae
more ci issues
tomoleary Oct 8, 2021
8525980
updating readme and playing around more with unit tests
tomoleary Oct 8, 2021
2a1dcbb
updating readme and unit tests
tomoleary Oct 8, 2021
54d2c23
updating
tomoleary Oct 8, 2021
ef0cfca
updating
tomoleary Oct 8, 2021
bce9f5c
updating
tomoleary Oct 8, 2021
b944a8a
struggling to get the unit tests to work properly
tomoleary Oct 8, 2021
60219ae
ci issues
tomoleary Oct 8, 2021
39eedde
updating again
tomoleary Oct 8, 2021
d252cef
updating again
tomoleary Oct 8, 2021
c5a4835
updating again
tomoleary Oct 8, 2021
50b3b34
updating please work thanks
tomoleary Oct 8, 2021
3d7852c
one last try
tomoleary Oct 8, 2021
a6cb252
continuing issues with travis build
tomoleary Oct 8, 2021
e873bb4
library linking error is driving me crazy
tomoleary Oct 8, 2021
1a34abb
library linking error is driving me crazy
tomoleary Oct 8, 2021
8c40206
library linking error is driving me crazy
tomoleary Oct 8, 2021
b3620ee
library linking error is driving me crazy
tomoleary Oct 8, 2021
f650d2c
exploring all these different ways travis doesnt make any sense
tomoleary Oct 8, 2021
4415f33
exploring all these different ways travis doesnt make any sense
tomoleary Oct 8, 2021
1866b2b
exploring all these different ways travis doesnt make any sense
tomoleary Oct 8, 2021
c8d961a
exploring all these different ways travis doesnt make any sense
tomoleary Oct 8, 2021
3474f47
exploring all these different ways travis doesnt make any sense
tomoleary Oct 8, 2021
2087da0
finally got ci working
tomoleary Oct 8, 2021
56391db
Merge branch 'main' into tom_dev
tomoleary Oct 8, 2021
05ae644
adding citation
tomoleary Oct 8, 2021
updating some work on improving active subspace memory footprint
tomoleary committed Oct 5, 2021
commit 7b702abd71c3879fadd08a29fbdb90709a3b6094
88 changes: 66 additions & 22 deletions hippyflow/modeling/activeSubspaceProjector.py
@@ -40,6 +40,12 @@ def ActiveSubspaceParameterList():
parameters['double_loop_samples'] = [20, 'Number of samples used in double loop MC approximation']
parameters['verbose'] = [True, 'Boolean for printing']


parameters['initialize_samples'] = [False,'Boolean for the initialization of samples when\
many samples are allocated on one process ']
parameters['serialized_sampling'] = [False, 'Boolean for the serialization of sampling on a process\
to reduce memory for large problems']

parameters['observable_constructor'] = [None,'observable constructor function, assumed to take a mesh, and kwargs']
parameters['observable_kwargs'] = [{},'kwargs used when instantiating multiple local instances of observables']
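For orientation, a minimal usage sketch of the two new options when constructing the projector; the observable and prior are assumed to be built as in the existing hippyflow drivers, and the samples_per_process value is purely illustrative.

AS_parameters = ActiveSubspaceParameterList()
AS_parameters['samples_per_process'] = 64      # illustrative value
AS_parameters['serialized_sampling'] = True    # draw and solve one sample at a time to reduce memory
AS_parameters['initialize_samples'] = False    # defer sample allocation/initialization
AS_projector = ActiveSubspaceProjector(observable, prior, parameters=AS_parameters)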

@@ -76,6 +82,26 @@ def mult(self,x,y):
else:
y.axpy(1.,temp)

class SeriallySampledJTJOperator:
'''
Alternative to SummedListOperator when memory is an issue for active subspace
'''
def __init__(self,observable,nsamples,communicator=None,average=True):
self.observable = observable
self.average = average
if communicator is None:
self.temp = None
else:
self.temp = dl.Vector(communicator)

self.u = self.observable.generate_vector(STATE)
self.m = self.observable.generate_vector(PARAMETER)

def MatMVMult(self,x,y):
for i in range(nsamples):
pass
pass
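As a hedged sketch of where this stub appears to be headed (not the committed implementation): the serialized matvec would accumulate the averaged action of E[J^T J] one prior sample at a time, so only a single state/parameter pair is held in memory. Here sample_parameter, solve_and_linearize, and apply_JtJ are hypothetical placeholders for the prior sampling, forward solve plus linearization, and J^T J action used elsewhere in hippyflow, and nsamples is assumed to be stored as self.nsamples in the constructor.

def mult(self, x, y):
    # Hedged sketch only: the helpers below are placeholders, not hippyflow API.
    y.zero()
    for i in range(self.nsamples):                 # assumes nsamples is stored in __init__
        m_i = sample_parameter()                   # hypothetical: draw m_i from the prior
        solve_and_linearize(self.observable, m_i)  # hypothetical: forward solve + set linearization point
        JtJx = apply_JtJ(self.observable, x)       # hypothetical: J_i^T J_i x at the current sample
        weight = 1.0/self.nsamples if self.average else 1.0
        y.axpy(weight, JtJx)                       # running (optionally averaged) accumulation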



class ActiveSubspaceProjector:
@@ -88,7 +114,7 @@ class ActiveSubspaceProjector:
Input active subspace: :math:`JJ' = VS^2V^*`
"""
def __init__(self,observable, prior, mesh_constructor_comm = None ,collective = None,\
initialize_samples = False, parameters = ActiveSubspaceParameterList()):
parameters = ActiveSubspaceParameterList()):
"""
Constructor
- :code:`observable` - object that implements the observable mapping :math:`m -> q(m)`
@@ -131,16 +157,9 @@ def __init__(self,observable, prior, mesh_constructor_comm = None ,collective =
# and avoid the time consuming sample initialization.











if self.parameters['samples_per_process'] > 1:
# Here we allocate many copies of the observable if serialized_sampling is not True in the
# active subspace parameters
if self.parameters['samples_per_process'] > 1 and not self.parameters['serialized_sampling']:
assert self.parameters['observable_constructor'] is not None
for i in range(self.parameters['samples_per_process']-1):
new_observable = self.parameters['observable_constructor'](self.observable.problem.Vh[0].mesh(),**self.parameters['observable_kwargs'])
@@ -149,17 +168,21 @@ def __init__(self,observable, prior, mesh_constructor_comm = None ,collective =
self.noise = dl.Vector(self.mesh_constructor_comm)
self.prior.init_vector(self.noise,"noise")


self.us = None
self.ms = None
self.Js = None
if self.parameters['serialized_sampling']:
self.u = None
self.m = None
self.J = None
else:
self.us = None
self.ms = None
self.Js = None

# Draw a new sample and set linearization point.
if initialize_samples:
self.initialize_samples()
if self.parameters['initialize_samples']:
if not self.parameters['serialized_sampling']:
self._initialize_batched_samples()



self.d_GN = None
self.V_GN = None
self.d_GN_noprior = None
@@ -170,7 +193,7 @@ def __init__(self,observable, prior, mesh_constructor_comm = None ,collective =
self.U_NG = None


def initialize_samples(self):
def _initialize_batched_samples(self):
"""
This method initializes the samples from the prior used in sampling
"""
@@ -204,13 +227,29 @@ def initialize_samples(self):


def construct_input_subspace(self,prior_preconditioned = True):
if self.parameters['serialized_sampling']:
self._construct_input_subspace_serialized(prior_preconditioned = prior_preconditioned)
else:
self._construct_input_subspace_batched(prior_preconditioned = prior_preconditioned)

def _construct_input_subspace_serialized(self,prior_preconditioned = True):
"""
This method implements the input subspace constructor
-:code:`prior_preconditioned` - a Boolean to decide whether to include the prior covariance in the decomposition
The default parameter is True which is customary in active subspace construction
"""
pass
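For context, a hedged sketch of how this serialized path could mirror the batched constructor below: wrap the serially sampled E[J^T J] operator and pass it to a randomized (generalized) eigensolver. The parameter keys 'rank' and 'oversampling', hippylib imported as hp, the doublePass/doublePassG signatures, and prior.R / prior.Rsolver are all assumptions taken from hippylib conventions and the batched code path, not confirmed by this commit.

def _construct_input_subspace_serialized(self, prior_preconditioned=True):
    # Sketch under the assumptions stated above; not the committed implementation.
    JTJ = SeriallySampledJTJOperator(self.observable,
                                     self.parameters['samples_per_process'])
    x = dl.Vector(self.mesh_constructor_comm)
    self.prior.init_vector(x, 0)
    nvec = self.parameters['rank'] + self.parameters['oversampling']   # assumed parameter keys
    Omega = hp.MultiVector(x, nvec)
    hp.parRandom.normal(1., Omega)                 # random test matrix for the randomized solver
    if prior_preconditioned:
        # generalized eigenproblem E[J^T J] v = lambda R v, with R the prior precision
        self.d_GN, self.V_GN = hp.doublePassG(JTJ, self.prior.R, self.prior.Rsolver,
                                              Omega, self.parameters['rank'], s=1)
    else:
        self.d_GN, self.V_GN = hp.doublePass(JTJ, Omega, self.parameters['rank'], s=1)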



def _construct_input_subspace_batched(self,prior_preconditioned = True):
"""
This method implements the input subspace constructor
-:code:`prior_preconditioned` - a Boolean to decide whether to include the prior covariance in the decomposition
The default parameter is True which is customary in active subspace construction
"""
if self.Js is None:
self.initialize_samples()
self._initialize_batched_samples()

if self.parameters['verbose']:
print(80*'#')
@@ -258,13 +297,18 @@ def construct_input_subspace(self,prior_preconditioned = True):
axis_label = ['i',r'$\lambda_i$',\
r'Eigenvalues of $\mathbb{E}_{\nu}[C{\nabla} q^T {\nabla} q]$'+self.parameters['plot_label_suffix']], out_name = out_name)

def construct_output_subspace(self,prior_preconditioned = True):
if self.parameters['serialized_sampling']:
pass
else:
self._construct_output_subspace_batched(prior_preconditioned = prior_preconditioned)
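At the driver level the refactor keeps the public interface the same and routes internally on the 'serialized_sampling' flag; a hypothetical usage sketch follows, continuing the AS_projector instance assumed earlier.

AS_projector.construct_input_subspace(prior_preconditioned=True)
# Output subspace: note that in this commit the batched helper below does not yet
# accept the prior_preconditioned argument that this dispatcher forwards to it.
AS_projector.construct_output_subspace()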

def construct_output_subspace(self):
def _construct_output_subspace_batched(self):
"""
This method implements the output subspace constructor
"""
if self.Js is None:
self.initialize_samples()
self._initialize_batched_samples()

if self.parameters['verbose']:
print(80*'#')