
Constraints versus penalties for edge-preserving full-waveform inversion

2017, The Leading Edge

Full-waveform inversion is challenging in complex geologic areas. Even when provided with an accurate starting model, the inversion algorithms often struggle to update the velocity model. Compared with other areas in applied geophysics, including prior information in full-waveform inversion is still in its relative infancy. In part, this is due to the fact that it is difficult to incorporate prior information that relates to geologic settings where strong discontinuities in the velocity model dominate, because these settings call for nonsmooth regularizations. We tackle this problem by including constraints on the spatial variations and value ranges of the inverted velocities, as opposed to adding penalties to the objective, which is more customary in mainstream geophysical inversion. By demonstrating the lack of predictability of edge-preserving inversion when the regularization is in the form of an added penalty term, we advocate the inclusion of constraints instead. Our examples ...

Released to public domain under Creative Commons license type BY (https://creativecommons.org/licenses/by/4.0). Copyright (c) 2016 SLIM group @ The University of British Columbia.

Constraints versus penalties for edge-preserving full-waveform inversion
Bas Peters and Felix J. Herrmann, SLIM, University of British Columbia. Submitted to TLE, 2016.

References:
Anagaw, A. Y., and Sacchi, M. D., 2011, Full waveform inversion with total variation regularization.
Epanomeritakis, I., Akçelik, V., Ghattas, O., and Bielak, J., 2008, A Newton-CG method for large-scale three-dimensional elastic full-waveform seismic inversion: Inverse Problems, 24, 034015.

Motivation
Full-waveform inversion (FWI) is:
‣ hampered by poor data & parasitic local minima
‣ ill-posed <=> missing frequencies & finite aperture
‣ expected to benefit from bounds & structure-promoting priors (TV- or \ell_1-norms)
Efforts so far have met w/ limited success:
‣ unpredictable dependence on (unnecessary) hyperparameters
‣ poor conditioning of structure-promoting regularization
‣ difficulties handling multiple pieces of prior information

FWI
Unconstrained optimization problem:
\min_{m \in \mathbb{R}^M} f(m) = \tfrac{1}{2} \sum_{i=1}^{N_s} \| F(m) q_i - d_i \|_2^2
with F(m) the modelling operator, q_i the sources, and d_i the data. Local derivative information is used to update the model:
m_{k+1} = m_k - \gamma \nabla_m f(m_k)
‣ no guarantee that model iterates remain (physically/geologically) feasible
‣ no mitigation of inversion artifacts by controlling the model's complexity

Stylized example
Forward model: d = F(c) q \equiv c \ast q
Unconstrained inversion:
\min_{c \in \mathbb{R}^M} \tfrac{1}{2} \| F(c) q - d \|_2^2
‣ when the source q misses low frequencies
‣ w/o regularization

Stylized example w/ constraints
Regularization via constraints on the model:
\min_{c \in \mathbb{R}^M} \tfrac{1}{2} \| F(c) q - d \|_2^2 \quad \text{subject to} \quad D c \ge 0, \; c \ge c_0
‣ minimal velocity (c \ge c_0)
‣ monotonically increasing gradient of the velocity (D c \ge 0)
Leads to successful recovery...
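To make the stylized example concrete, the following is a minimal NumPy/SciPy sketch (illustrative, not the authors' code): the forward model d = F(c)q ≡ c ∗ q is convolution with a band-limited wavelet, and the constrained fit adds the minimal-velocity and monotonicity constraints through `scipy.optimize.minimize`. The wavelet, grid size, true profile, and bound value below are made-up assumptions.

```python
# Sketch of the stylized 1D example: d = F(c) q = c * q, inverted
# with c >= c0 (minimal velocity) and D c >= 0 (monotonicity).
import numpy as np
from scipy.optimize import minimize, LinearConstraint, Bounds

n = 50
c_true = np.where(np.arange(n) < 25, 1600.0, 2200.0)        # blocky velocity profile [m/s]
t = np.arange(-10, 11) * 0.01
q = (1 - 2 * (np.pi * 15 * t) ** 2) * np.exp(-(np.pi * 15 * t) ** 2)  # Ricker-ish wavelet, no DC

# Convolution written as a matrix, so F(c) q = Q c.
Q = np.column_stack([np.convolve(np.eye(n)[:, j], q) for j in range(n)])
d = Q @ c_true                                              # noise-free data

def f(c):    return 0.5 * np.sum((Q @ c - d) ** 2)          # least-squares misfit
def grad(c): return Q.T @ (Q @ c - d)                       # its gradient

D = np.diff(np.eye(n), axis=0)                              # first-difference operator
monotone = LinearConstraint(D, 0.0, np.inf)                 # D c >= 0
bounds   = Bounds(1500.0, np.inf)                           # c >= c0 = 1500 m/s

res = minimize(f, np.full(n, 1600.0), jac=grad, method="trust-constr",
               constraints=[monotone], bounds=bounds)
c_constrained = res.x
```

With a wavelet that misses the low frequencies, the long-wavelength part of c is poorly constrained by the data alone; the bound and monotonicity constraints supply that missing information, which is the point of the recovery comparison in the figures below.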
[Figures: stylized 1D example, inversion w/o constraints and w/ constraints; each row shows the true velocity, the data, the inverted velocity, and the data fit (c from 1400 to 2200 m/s, z from 0 to 1000 m).]

Tikhonov regularization
Add quadratic penalty terms:
\min_m f(m) + \tfrac{\alpha}{2} \| R_1 m \|^2 + \tfrac{\beta}{2} \| R_2 m \|^2
‣ well-known & successful technique
‣ is differentiable
‣ not an exact penalty
‣ regularization may adversely affect gradient & Hessian
‣ requires non-trivial choices for hyperparameters
‣ not easily extended to edge-preserving \ell_1-norms
‣ no guarantee that all model iterates are regularized
[Portrait: Andrey Tikhonov, 1906–1993]

Regularization w/ constraints
Add multiple constraints:
\min_m f(m) \quad \text{subject to} \quad m \in C_1 \cap C_2
‣ not well known in our community
‣ requires understanding of the latest optimization techniques
‣ does not affect gradient & Hessian
‣ easier parameterization
‣ able to uniquely project onto the intersection of multiple constraint sets
‣ constraints do not need to be differentiable
‣ constraints are satisfied at every model iterate
[Portrait: Jean-Jacques Moreau, 1923–2014]
Boyd, S., Parikh, N., Chu, E., Peleato, B., and Eckstein, J., 2011, doi:10.1561/2200000016

POCS vs. best approximation
"Projection"-onto-convex-sets (POCS) solves convex feasibility problems:
\text{find } m \in C_1 \cap C_2
Instead, we solve convex projection problems:
\min_m \| m - x \|_2^2 \quad \text{subject to} \quad m \in C_1 \cap C_2
‣ obtain optimal (also feasible) approximations
‣ project uniquely w/ Dykstra's algorithm onto intersections of convex sets

Feasibility vs. optimal
Optimal (red) --> unique solution
Feasible (green) --> depends on the order of the projections
[Figures: projection of a starting point onto the intersection of a circle & square; the optimal (projected) point is unique, whereas the POCS feasible point depends on the projection order.]

Our constraints
‣ bounds: m \in \text{Box}, meaning l \le m \le u elementwise
‣ total-variation norm ball: \text{TV}(m) \le \tau

Proximal projection
Find a model m, closest to x, that satisfies the constraints:
P_C(x) = \arg\min_m \| m - x \|_2^2 \quad \text{subject to} \quad m \in C_1 \cap C_2
‣ yields a nonlinear "minimal-complexity" best approximation of x
‣ m \to x when \tau \to \tau_0 = \text{TV}(x) and m \in \text{Box}
‣ edge preserving
[Figures, best approximations: projections of the true model onto TV balls w/ \tau = 0.15, 0.25, 0.5, and 0.75 \times \tau_0, where \tau_0 = \text{TV}(m^*), alongside the true model.]

FWI w/ non-differentiable penalties
Nonlinear least-squares objective for FWI w/ TV:
\min_m \| d^{\text{obs}} - d^{\text{sim}}(m) \|^2 + \alpha \, \text{TV}(m)
‣ \text{TV}(m) is not differentiable, so there is no access to \nabla f(m) and \nabla^2 f(m)
‣ gradient-descent / quasi-Newton / (Gauss-Newton) solvers need local derivative information
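For reference, the anisotropic total variation that defines the TV ball (and the budget τ0 = TV(m*) used in the best-approximation panels above) is just a sum of absolute first differences. The short NumPy sketch below is a generic illustration, not the authors' implementation; the absolute value is exactly where the non-differentiability enters.

```python
# Sketch: anisotropic TV of a 2D model and a TV-ball radius chosen as a
# fraction of the TV of a reference model, as in the panels above.
import numpy as np

def tv_anisotropic(m, h=1.0):
    """Sum of absolute vertical and horizontal first differences (non-smooth)."""
    return (np.abs(np.diff(m, axis=0)).sum() + np.abs(np.diff(m, axis=1)).sum()) / h

m_star = np.full((60, 120), 1500.0)     # toy blocky "true" model [m/s]
m_star[30:, :] = 2500.0
tau0 = tv_anisotropic(m_star)           # TV(m*)
tau  = 0.5 * tau0                       # e.g. the 0.5*tau0 constraint shown above
```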
FWI w/ non-differentiable penalties – possible solution: \epsilon-smoothing of TV
\text{TV}_\epsilon(m) = \frac{1}{h} \sum_{ij} \sqrt{ (m_{i+1,j} - m_{i,j})^2 + (m_{i,j+1} - m_{i,j})^2 + \epsilon^2 }
Differentiable objective with penalty parameter \alpha:
\min_m f(m) + \alpha \, \text{TV}_\epsilon(m)
Problem: two unintuitive hyperparameters need to be selected.
[Figure: the smoothed singularity of \|x\|_1 near the origin for \epsilon = 10^{-2}, 10^{-3}, 10^{-4}.]

Numerical example 1
• FWI w/ smoothed TV penalty & box constraints
• data w/ zero-mean random noise, \|\text{noise}\|_2 / \|\text{signal}\|_2 = 0.25
• starting model = smoothed true model
• frequency batches from 3 Hz to 10 Hz
[Figures: FWI results using the smoothed TV penalty for various (\alpha, \epsilon) combinations, \alpha = 10^5, 10^6, 10^7 and \epsilon = 10^{-2}, 10^{-3}, 10^{-4}; "blockiness" increases with \alpha.]

Constrained formulation
Problem statement:
\min_m f(m) \quad \text{subject to} \quad m \in \bigcap_{i=1}^p C_i
Our approach: solve this problem directly. There are many ways to solve it.

Algorithm design – wish list
‣ applying the constraints should not require additional expensive gradient & objective calculations
‣ updated models need to satisfy all constraints after each iteration
‣ an arbitrary number of constraints should be handled, as long as their intersection is non-empty
‣ manual tuning of parameters should be limited to a bare minimum
‣ constraints should work w/ black-box gradients & objectives

Nested optimization strategy
Constrained optimization via 3 levels of nested optimization:
1. Projected gradients = expensive step
2. Dykstra's algorithm*
3. Projection onto each set separately (closed form or w/ ADMM)
* parameter free

Projected gradients
Algorithm:
m_{k+1} = P_C\big( m_k - \gamma \nabla_m f(m_k) \big)
where m_k - \gamma \nabla_m f(m_k) is the gradient step (proposed model) and P_C is the projection onto the intersection of the constraint sets:
P_C(m) = \arg\min_x \| x - m \|_2 \quad \text{s.t.} \quad x \in \bigcap_{i=1}^p C_i

Constrained formulation – projection onto the intersection
Typically there is no closed-form solution for P_C --> use Dykstra's algorithm, which only requires:
‣ projections onto each set separately
‣ vector additions

Dykstra splitting
Toy example: find the projection onto the intersection of a circle & square.
Algorithm 1 (Dykstra):
x_0 = m, p_0 = 0, q_0 = 0
For k = 0, 1, ...
  y_k = P_{C_1}(x_k + p_k)
  p_{k+1} = x_k + p_k - y_k
  x_{k+1} = P_{C_2}(y_k + q_k)
  q_{k+1} = y_k + q_k - x_{k+1}
End
[Figures: POCS vs. Dykstra iterates on the circle-&-square toy example.]
Only needs projections onto each set separately!
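Algorithm 1 translates directly into a few lines of NumPy. The disk and box parameters below are made-up stand-ins for the circle-and-square toy example; the point is that only the two individual projections plus vector additions are needed, and (unlike POCS) the result is the unique closest point in the intersection.

```python
# Dykstra's algorithm for projecting a point onto the intersection of two
# convex sets, here a disk (circle) and an axis-aligned box (square).
import numpy as np

def proj_disk(z, center, radius):
    """Euclidean projection onto {x : ||x - center|| <= radius}."""
    d = z - center
    n = np.linalg.norm(d)
    return z if n <= radius else center + radius * d / n

def proj_box(z, lo, hi):
    """Euclidean projection onto {x : lo <= x <= hi} (elementwise clip)."""
    return np.clip(z, lo, hi)

def dykstra(m, proj1, proj2, maxit=100):
    """Projection of m onto C1 ∩ C2 using only the individual projections."""
    x = m.astype(float)
    p = np.zeros_like(x)
    q = np.zeros_like(x)
    for _ in range(maxit):
        y = proj1(x + p)        # project the corrected iterate onto C1
        p = x + p - y           # update the correction term for C1
        x = proj2(y + q)        # project the corrected iterate onto C2
        q = y + q - x           # update the correction term for C2
    return x

# Made-up toy sets roughly mimicking the slide's figure.
m0 = np.array([2.9, 2.6])
x_star = dykstra(m0,
                 lambda z: proj_disk(z, np.array([2.3, 2.3]), 0.4),
                 lambda z: proj_box(z, 1.8, 2.2))
print(x_star)
```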
Individual projections
Projection onto bounds (closed form):
P_{C_1}(m)_i = \text{median}\{ l_i, m_i, u_i \} \quad \forall i
Projection onto the TV ball: split variables, then use ADMM for x & z:
P_{C_2}(m) = \arg\min_x \tfrac{1}{2} \| x - m \|_2^2 \quad \text{s.t.} \quad x \in C_2
           = \arg\min_x \tfrac{1}{2} \| x - m \|_2^2 \quad \text{s.t.} \quad \| \nabla x \|_1 \le \sigma
           = \arg\min_x \tfrac{1}{2} \| x - m \|_2^2 \quad \text{s.t.} \quad \| z \|_1 \le \sigma, \; \nabla x = z

Workflow
The user-provided FWI code supplies the function value f(m_k) and gradient \nabla_m f(m_k); the optimization algorithm (projected & proximal algorithms) turns these into the model update. It sends the proposed model m_k - \gamma \nabla_m f(m_k) to Dykstra's algorithm, which computes the projection onto the intersection of the constraint sets by calling only the projectors onto each set separately:
‣ bounds: closed-form solution, P_{C_1}(m)_i = \text{median}\{ l_i, m_i, u_i \}
‣ TV: ADMM for \arg\min_x \tfrac{1}{2} \| x - m \|_2^2 \; \text{s.t.} \; \| z \|_1 \le \sigma, \, \nabla x = z
The result is the projected-gradient update m_{k+1} = P_C( m_k - \gamma \nabla_m f(m_k) ) with P_C(m) = \arg\min_x \| x - m \|_2 \; \text{s.t.} \; x \in \bigcap_{i=1}^p C_i. (A minimal sketch of this loop is given after the comparison below.)

Numerical example 1 – revisited
• FWI w/ TV-norm & bound constraints
• data w/ zero-mean random noise, \|\text{noise}\|_2 / \|\text{signal}\|_2 = 0.25
• starting model = smoothed true model
• frequency batches from 3 Hz to 10 Hz
[Figures, FWI w/ constraints: inverted models for \tau = 0.15, 0.25, 0.5, and 0.75 \times \tau_0, where \tau_0 = \text{TV}(m^*); x and z from 0 to 3000 m.]

Penalties vs. constraints
Regularization w/ penalties:
‣ inversion results for the various (\alpha, \epsilon) combinations behave unpredictably
‣ challenges proper parameter settings
‣ offers no guarantee of feasibility for each model iterate
Regularization w/ constraints:
‣ inversion results behave predictably for increasing \tau
‣ edges are preserved for not-too-large \tau
‣ inversion artifacts appear for too-large \tau
Suggests a cooling technique w/ warm starts where \tau is increased slowly...
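The workflow above boils down to a short projected-gradient loop around a user-supplied black box. In the sketch below, `fwi_objective` (returns misfit and gradient) and `project` (e.g. Dykstra over the bound and TV sets) are hypothetical placeholders, not the paper's actual interface; the bound projection really is just an elementwise clip, which implements the median formula above.

```python
# Sketch of the projected-gradient workflow with black-box FWI code.
import numpy as np

def proj_bounds(m, lo, hi):
    # P_{C1}(m)_i = median{l_i, m_i, u_i}: an elementwise clip to the box.
    return np.clip(m, lo, hi)

def projected_gradient(m0, fwi_objective, project, step=1.0, maxit=20):
    m = project(m0)                   # start from a feasible model
    for _ in range(maxit):
        f, g = fwi_objective(m)       # expensive part: PDE solves for misfit & gradient
        m = project(m - step * g)     # Dykstra over all constraint sets -> feasible iterate
    return m
```

In the BP example below the fixed step is replaced by a spectral step length with a non-monotone line search, but the structure of the loop stays the same.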
Case study: improve the delineation of salt for a good starting model but poor (6 dB) data...
Billette, F., and Brandsberg-Dahl, S., 2005.

Reduced BP model – modelling parameters
‣ number of sources: 132; number of receivers: 311
‣ receiver spacing: 40 m; source spacing: 80 m; maximum offset: 11.5 km
‣ grid size: 20 m
‣ known Ricker-wavelet sources with 15 Hz peak frequency
‣ data available starting at 3 Hz
‣ 8 simultaneous shots w/ Gaussian weights w/ redraws
‣ starting model = smoothed true model
‣ inversion crime, but poor data

True velocity model – reduced by a factor of 2.5
[Figure: true velocity model; z from 0 to 3000 m, x from 0 to 12000 m, velocities from 1500 to 5000 m/s.]

Starting model
[Figure: initial (smoothed) velocity model on the same axes.]

Adjoint-state FWI w/ noisy data
[Figure: 50% noise, FWI, bounds only, 3rd cycle.]

WRI w/ TV constraints
[Figure: 50% noise, WRI, TV constraints, 3rd cycle.]

Ernie Esser, Lluís Guasch, Felix J. Herrmann, and Mike Warner, "Constrained waveform inversion for automatic salt flooding", The Leading Edge, vol. 35, p. 235-239, 2016.
Ernie Esser, Lluís Guasch, Tristan van Leeuwen, Aleksandr Y. Aravkin, and Felix J. Herrmann, "Total-variation regularization strategies in full-waveform inversion", 2016.

Heuristic
Multiple frequency cycles:
‣ warm starts
‣ increasingly relaxed TV constraints & fixed bound constraints
‣ starts w/ the relaxed TV-norm of the starting model
Extend the search space:
‣ make sure the data are fitted
‣ optimize over the model & (source) wavefields
‣ jointly fit the data & the wave equation (physics)
‣ use the noise level to automatically select the trade-off between the data & PDE (physics) fits
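The multi-cycle heuristic amounts to two nested loops with warm starts and a slowly growing TV budget. In this sketch, `invert_batch` is a hypothetical helper standing in for one constrained FWI/WRI run on a frequency batch, and the relaxation factor is an illustrative choice rather than a value from the talk.

```python
# Sketch of warm-started frequency cycles with an increasingly relaxed TV budget.
import numpy as np

def tv_anisotropic(m):
    return np.abs(np.diff(m, axis=0)).sum() + np.abs(np.diff(m, axis=1)).sum()

def warm_started_cycles(m0, freq_batches, invert_batch, bounds,
                        n_cycles=3, relax=1.25):
    m = m0.copy()
    tau = tv_anisotropic(m0)              # start from the TV of the starting model
    for _ in range(n_cycles):             # warm starts: each cycle reuses the latest model
        for freqs in freq_batches:        # e.g. 3-9 Hz in consecutive batches of 2
            m = invert_batch(m, freqs, tau, bounds)
        tau *= relax                      # relax the TV constraint; bounds stay fixed
    return m
```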
Wavefield Reconstruction Inversion (WRI) – gradient
Tristan van Leeuwen and Felix J. Herrmann, "A penalty method for PDE-constrained optimization in inverse problems", Inverse Problems, vol. 32, p. 015007, 2015.
Tristan van Leeuwen and Felix J. Herrmann, "Mitigating local minima in full-waveform inversion by expanding the search space", Geophysical Journal International, vol. 195, p. 661-667, 2013.

Adjoint-state method:
for each source i
  solve A(m) u_i = q_i
  solve A(m)^* v_i = P_i^* (P_i u_i - d_i)
  g = g + \omega^2 \, \text{diag}(u_i)^* v_i        (correlation of wavefield & data residual)
end

WRI method:
for each source i
  solve \begin{pmatrix} P_i \\ \lambda A(m) \end{pmatrix} \bar{u}_{\lambda,i} \approx \begin{pmatrix} d_i \\ \lambda q_i \end{pmatrix} in the least-squares sense
  g = g + \lambda^2 \omega^2 \, \text{diag}(\bar{u}_{\lambda,i})^* \big( A(m) \bar{u}_{\lambda,i} - q_i \big)        (correlation of proxy wavefield & PDE residual)
end
(A numerical sketch of both gradient loops follows after the results below.)

Patent application WO2014172787 – A PENALTY METHOD FOR PDE-CONSTRAINED OPTIMIZATION – pending.

BP model – inversion parameters
Optimization specs:
‣ spectral projected gradients
‣ non-monotone linesearch w/ a window size of 5
‣ at most 8 Dykstra iterations
Constraint specs:
‣ frequency continuation from 3 to 9 Hz in consecutive batches of 2
‣ 3 warm-started frequency sweeps w/ \tau = 1.00 \times \text{TV}(m_0)
‣ anisotropic TV

Results for \|\text{noise}\|_2 / \|\text{signal}\|_2 = 0.25 and 0.50
[Figures: inverted velocity models after the 1st, 2nd, and 3rd frequency cycles at 25% and 50% noise; each panel set compares FWI vs. WRI, with bounds only vs. bounds & TV; z from 0 to 3000 m, x from 0 to 12000 m, velocities from 1500 to 5000 m/s.]
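To connect the two recursions in the Wavefield Reconstruction Inversion comparison above, here is a per-source, per-frequency sketch with SciPy sparse matrices. It assumes a Helmholtz-type system A(m) = ω² diag(m) + L (m in squared slowness) and a real receiver-restriction matrix P; the signs and scalings follow the slide's recursions and may differ by convention from any particular implementation, so treat it as a sketch rather than the released code.

```python
# Sketch: adjoint-state vs. WRI gradient contributions for one source and
# one frequency, for A(m) = omega^2 * diag(m) + L (assumed discretization).
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def adjoint_state_gradient(A, P, q, d, omega):
    u = spla.spsolve(A.tocsc(), q)                            # forward wavefield
    v = spla.spsolve(A.conj().T.tocsc(), P.T @ (P @ u - d))   # adjoint wavefield
    return omega**2 * np.real(np.conj(u) * v)                 # correlate wavefield & data residual

def wri_gradient(A, P, q, d, omega, lam):
    # Data-augmented least-squares solve (normal equations of the penalty objective).
    lhs = (P.T @ P + lam**2 * (A.conj().T @ A)).tocsc()
    rhs = P.T @ d + lam**2 * (A.conj().T @ q)
    u = spla.spsolve(lhs, rhs)                                # proxy wavefield
    return lam**2 * omega**2 * np.real(np.conj(u) * (A @ u - q))  # correlate w/ PDE residual

# Tiny 1D smoke test with made-up numbers.
n, h, omega, lam = 100, 10.0, 2 * np.pi * 5.0, 1.0
m_true = np.full(n, 1.0 / 1500.0**2)                          # squared slowness
L = sp.diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(n, n)) / h**2
P = sp.csr_matrix(np.eye(n)[::5])                             # receivers at every 5th point
q = np.zeros(n); q[n // 2] = 1.0
d = P @ spla.spsolve((omega**2 * sp.diags(m_true) + L).tocsc(), q)
A = (omega**2 * sp.diags(0.9 * m_true) + L).tocsc()           # wrong model: nonzero gradients
g_fwi = adjoint_state_gradient(A, P, q, d, omega)
g_wri = wri_gradient(A, P, q, d, omega, lam)
```

The WRI solve replaces the exact wave-equation solve by a data-augmented least-squares solve, which is what allows the noise level to set the trade-off between the data fit and the PDE fit.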
Conclusions & generalizations
Adding constraints to the inversion:
‣ leaves the gradient (and Hessian) untouched
‣ is robust & behaves predictably, w/ model iterates that remain feasible
‣ gives intuitive parameterizations of prior information in > 2 constraints
‣ can be accelerated w/ quasi-Newton methods & parallelized projections
‣ is a "black-box" solution that works w/ any implementation of FWI/WRI
Extensions paired w/ constraints are a powerful combination!

Acknowledgements
This research was carried out as part of the SINBAD project with the support of the member organizations of the SINBAD Consortium.