Yes, I am sure that Python is 64-bit too. I ran the following in the Python
interpreter:
>>> import platform
>>> platform.architecture()
('64bit', 'WindowsPE')
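As an additional sanity check (only a minimal sketch using the standard library, nothing MSNoise-specific), the word size can also be read from sys.maxsize, which exceeds 2**32 only on a 64-bit build:

# Supplementary 64-bit check using only the standard library.
import struct
import sys

print(sys.maxsize > 2**32)   # True on a 64-bit Python build, False on 32-bit
print(struct.calcsize("P"))  # pointer size in bytes: 8 on 64-bit, 4 on 32-bit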
Mochammad Husni Rizal
B.Sc. (Geophysics) - Universitas Gadjah Mada
Phone : +62-857-151-22090
On Fri, Nov 20, 2015 at 3:14 PM, Thomas Lecocq <thlecocq(a)gmail.com> wrote:
Hi,
The number of days in the analysis should not be an issue; msnoise
processes day by day... I suppose Python is 64-bit too, right?
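To illustrate the day-by-day idea (a rough sketch only, not the actual MSNoise internals), the loop below walks the configured date range one day at a time, so the working set is always a single day of traces no matter how long the project is:

# Rough sketch of day-by-day processing (illustration only, not MSNoise code).
from datetime import date, timedelta

start, end = date(2013, 1, 1), date(2016, 1, 1)  # startdate / enddate from the config

day = start
while day < end:
    # One "goal day" at a time: read, preprocess and cross-correlate only this
    # day's traces, then release them before moving on to the next day.
    print("processing " + day.isoformat())
    day += timedelta(days=1)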
Thomas
On 20 Nov 2015 08:40, "Mochammad Husni Rizal" <mochhusnir(a)gmail.com> wrote:
Yes, I am using Windows 8 64-bit (8 GB RAM).
There are 7 stations.
My configuration is as follows:
>msnoise info
General:
- db.ini is present
Configuration:
- E:\CONV exists
- CROSS_CORRELATIONS does not exists (and that is normal because keep_all=False)
Raw config bits: "D"efault or "M"odified (green)
M data_folder: E:\CONV
D output_folder: CROSS_CORRELATIONS
D data_structure: SDS
D network: *
D channels: *
M startdate: 2013-01-01
M enddate: 2016-01-01
D analysis_duration: 86400
D cc_sampling_rate: 20.0
D resampling_method: Resample
D decimation_factor: 5
D preprocess_lowpass: 8.0
D preprocess_highpass: 0.01
D maxlag: 120.
D corr_duration: 1800.
D overlap: 0.0
D windsorizing: 3
D crondays: -1
D ZZ: Y
D ZR: N
D ZT: N
D RZ: N
D RR: N
D RT: N
D TZ: N
D TR: N
D TT: N
D autocorr: N
D PAZ: N
D keep_all: N
D keep_days: Y
M ref_begin: 2013-01-01
M ref_end: 2016-01-01
D mov_stack: 5
D export_format: MSEED
D sac_format: doublets
D dtt_lag: static
D dtt_v: 1.0
D dtt_minlag: 5.0
D dtt_width: 30.0
D dtt_sides: both
D dtt_mincoh: 0.65
D dtt_maxerr: 0.1
D dtt_maxdt: 0.1
CC Jobs:
I : 2
T : 5573
DTT Jobs:
I use the following filter:
Low: 0.1
mwcs low: 0.15
mwcs high: 0.95
high: 1.0
RMS threshold: 1.0
mwcs wlen: 12.0
mwcs step: 4.0
*Note:*
I used a similar configuration and filter to process the same data but with
a smaller subset (only one month of data), and it worked.
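For a rough feel for the per-day memory load (back-of-the-envelope only, and assuming a raw sampling rate of 100 Hz, which is not stated above), one day of vertical-component data for 7 stations is roughly:

# Back-of-the-envelope estimate of one day's raw data in memory.
# ASSUMPTION: raw sampling rate of 100 Hz (not confirmed); 64-bit floats.
n_stations = 7
raw_sps = 100.0            # assumed raw sampling rate, Hz
seconds_per_day = 86400    # analysis_duration from the config
bytes_per_sample = 8       # float64 once traces are merged/processed

one_day_bytes = n_stations * raw_sps * seconds_per_day * bytes_per_sample
print("%.0f MB of raw samples per day (before decimation to 20 Hz)"
      % (one_day_bytes / 1e6))
# Roughly 480 MB per component per day under these assumptions; temporary
# copies made by merge/concatenate can transiently need a multiple of that.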
Warm regards,
Mochammad Husni Rizal
B.Sc. (Geophysics) - Universitas Gadjah Mada
Phone : +62-857-151-22090
On Fri, Nov 20, 2015 at 2:26 PM, Thomas Lecocq <thomas.lecocq(a)seismology.be> wrote:
> Hi,
>
> It seems indeed that the project is quite large. Are you using a 64-bit
> machine? How many stations are there?
>
> Cheers,
>
> Thomas
>
>
> On 20/11/2015 05:30, Mochammad Husni Rizal wrote:
>
>> Dear MSNoise enthusiasts,
>>
>> I am currently processing Merapi volcano data from 7 stations, covering
>> January 2013 - October 2015. When I type:
>>
>>> msnoise compute_cc
>>
>> I get:
2015-11-20 11:08:14 [INFO] *** Starting: Compute CC ***
2015-11-20 11:08:14 [INFO] Will compute ZZ
2015-11-20 11:08:14 [INFO] New CC Job: 2013-11-16 (1 pairs with 2 stations)
Traceback (most recent call last):
  File "C:\Users\Dell\Anaconda\Scripts\msnoise-script.py", line 9, in <module>
    load_entry_point('msnoise==1.3.1', 'console_scripts', 'msnoise')()
  File "C:\Users\Dell\Anaconda\lib\site-packages\msnoise\scripts\msnoise.py", line 393, in run
    cli(obj={})
  File "C:\Users\Dell\Anaconda\lib\site-packages\click-4.0-py2.7.egg\click\core.py", line 664, in __call__
    return self.main(*args, **kwargs)
  File "C:\Users\Dell\Anaconda\lib\site-packages\click-4.0-py2.7.egg\click\core.py", line 644, in main
    rv = self.invoke(ctx)
  File "C:\Users\Dell\Anaconda\lib\site-packages\click-4.0-py2.7.egg\click\core.py", line 991, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "C:\Users\Dell\Anaconda\lib\site-packages\click-4.0-py2.7.egg\click\core.py", line 837, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "C:\Users\Dell\Anaconda\lib\site-packages\click-4.0-py2.7.egg\click\core.py", line 464, in invoke
    return callback(*args, **kwargs)
  File "C:\Users\Dell\Anaconda\lib\site-packages\msnoise\scripts\msnoise.py", line 174, in compute_cc
    main()
  File "C:\Users\Dell\Anaconda\lib\site-packages\msnoise\s03compute_cc.py", line 271, in main
    basetime, tramef_Z = preprocess(db, stations, comps, goal_day, params, tramef_Z)
  File "C:\Users\Dell\Anaconda\lib\site-packages\msnoise\s03compute_cc.py", line 148, in preprocess
    stream.merge(method=0, fill_value=0.0)
  File "C:\Users\Dell\Anaconda\lib\site-packages\obspy-0.10.2-py2.7-win-amd64.egg\obspy\core\stream.py", line 1823, in merge
    interpolation_samples=interpolation_samples)
  File "C:\Users\Dell\Anaconda\lib\site-packages\obspy-0.10.2-py2.7-win-amd64.egg\obspy\core\trace.py", line 786, in __add__
    data = np.concatenate(data)
MemoryError
Is this because the dataset is too large (~120 GB)?
Please help. Thank you.
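If it helps to narrow things down, here is a minimal standalone reproduction of the failing merge step, outside MSNoise (my own sketch; the SDS path pattern and the HHZ channel code below are hypothetical and need to be adapted to the actual archive layout). It reads one day of data for all stations, here day 320 of 2013, which is the 2013-11-16 job shown in the log above, and calls the same merge as in the traceback:

# Minimal standalone reproduction of the failing merge step (sketch only).
# NOTE: the SDS glob pattern and channel code are hypothetical; adjust them
# to the real archive layout under the data_folder E:\CONV.
import glob

from obspy import read, Stream

st = Stream()
# Read one day's vertical-component files for all stations (2013, day 320).
for f in glob.glob(r"E:\CONV\2013\*\*\HHZ.D\*.2013.320"):
    st += read(f)

print(st)
print("total samples: %d" % sum(tr.stats.npts for tr in st))

# Same call as in msnoise's preprocess(); a MemoryError here, outside MSNoise,
# would point at the size of a single in-memory day, not the 120 GB total.
st.merge(method=0, fill_value=0.0)
print(st)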
Warm regards,
Mochammad Husni Rizal
B.Sc. (Geophysics) - Universitas Gadjah Mada
Phone : +62-857-151-22090
_______________________________________________
MSNoise mailing list
MSNoise(a)mailman-as.oma.be
http://mailman-as.oma.be/mailman/listinfo/msnoise