Opened 5 years ago

Closed 5 years ago

Last modified 5 years ago

#270 closed bug (wontfix)

Mpich2 running error

Reported by: loc duong ding <mambom1902@…> Owned by:
Priority: major Milestone:
Component: mpich Keywords:
Cc:

Description

Dear MPICH2 team,

I use MPICH2 to run the Quantum Espresso code. I get an error when I try to run
the executable pw.x.

>>mpiexec -machinefile  /home/loc/machinefile -n 8 pw.x -npool 2  <
input_Graphite_GGA > output_Graphite_GGA2 &

application called MPI_Abort(MPI_COMM_WORLD, 0) - process 1
application called MPI_Abort(MPI_COMM_WORLD, 0) - process 3[cli_3]: aborting
job                   :
application called MPI_Abort(MPI_COMM_WORLD, 0) - process 3
application called MPI_Abort(MPI_COMM_WORLD, 0) - process 7[cli_7]: aborting
job                   :
application called MPI_Abort(MPI_COMM_WORLD, 0) - process 7
application called MPI_Abort(MPI_COMM_WORLD, 0) - process 2[cli_2]: aborting
job                   :
application called MPI_Abort(MPI_COMM_WORLD, 0) - process 2
application called MPI_Abort(MPI_COMM_WORLD, 0) - process 6[cli_6]: aborting
job                   :
application called MPI_Abort(MPI_COMM_WORLD, 0) - process 6
application called MPI_Abort(MPI_COMM_WORLD, 0) - process 5[cli_5]: aborting
job                   :
application called MPI_Abort(MPI_COMM_WORLD, 0) - process 5
application called MPI_Abort(MPI_COMM_WORLD, 0) - process 4[cli_4]: aborting
job                   :
application called MPI_Abort(MPI_COMM_WORLD, 0) - process 4

In the output file, the error is:

rank 7 in job 2  master_49510   caused collective abort of all ranks
  exit status of rank 7: return code 0
rank 3 in job 2  master_49510   caused collective abort of all ranks
  exit status of rank 3: return code 0
rank 1 in job 2  master_49510   caused collective abort of all ranks
  exit status of rank 1: return code 0


I tested the mpiexec command with some other executables and there was no
problem. pw.x is also fine when run alone (serial run).

Please give me some suggestions about this error.

Best regards,
Duong Dinh Loc. 



Attachments (7)

part0001.html (2.9 KB) - added by loc duong ding 5 years ago.
Added by email2trac
part0001.2.html (5.5 KB) - added by loc duong ding 5 years ago.
Added by email2trac
part0001.3.html (8.3 KB) - added by loc duong ding 5 years ago.
Added by email2trac
part0001.4.html (22.0 KB) - added by loc duong ding 5 years ago.
Added by email2trac
config.enable_timer.log (211.3 KB) - added by loc duong ding 5 years ago.
Added by email2trac
config.log (188.2 KB) - added by loc duong ding 5 years ago.
Added by email2trac
part0001.5.html (13.1 KB) - added by loc duong ding 5 years ago.
Added by email2trac


Change History (17)

Changed 5 years ago by loc duong ding

Added by email2trac

comment:1 Changed 5 years ago by loc duong ding

  • id set to 270

This message has 1 attachment(s)

comment:2 Changed 5 years ago by Pavan Balaji


>  application called MPI_Abort(MPI_COMM_WORLD, 0) - process 1
>  application called MPI_Abort(MPI_COMM_WORLD, 0) - process 3[cli_3]:
>  aborting
>  job                   :

As the error message suggests, your application is calling MPI_Abort.
You can search through your code to see where and why it calls MPI_Abort.

  -- Pavan

--
Pavan Balaji
http://www.mcs.anl.gov/~balaji
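
For reference, a minimal sketch of the pattern behind these messages (an illustration, not code from Quantum Espresso or MPICH2): any rank that detects a fatal condition calls MPI_Abort, and each aborting rank then produces an "application called MPI_Abort(MPI_COMM_WORLD, 0)" line like the ones quoted above.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Hypothetical fatal condition standing in for an application error. */
    int fatal = 1;
    if (fatal) {
        fprintf(stderr, "rank %d: fatal error, aborting\n", rank);
        MPI_Abort(MPI_COMM_WORLD, 0);  /* error code 0, matching the log */
    }

    MPI_Finalize();  /* not reached once MPI_Abort fires */
    return 0;
}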

comment:3 Changed 5 years ago by thakur

  • Resolution set to wontfix
  • Status changed from new to closed

Changed 5 years ago by loc duong ding

Added by email2trac

comment:4 Changed 5 years ago by loc duong ding

Dear MPICH2 team,
I am seeing this problem again and have checked the error. The error in the output file is:

 total energy              =  -313.00559496 Ry
     Harris-Foulkes estimate   =  -314.62641196 Ry
     estimated scf accuracy    <     0.04292274 Ry
     total magnetization       =     2.00 Bohr mag/cell
     absolute magnetization    =     3.41 Bohr mag/cell
     iteration #  2     ecut=    35.00 Ry     beta=0.60
mpiexec_master: mpd_uncaught_except_tb handling:
  exceptions.IOError: [Errno 5] Input/output error
    /opt/mpich2/bin/mpiexec  1051  handle_cli_stderr_input
        sys.stderr.write(msg)
    /opt/mpich2/bin/mpdlib.py  762  handle_active_streams
        handler(stream,*args)
    /opt/mpich2/bin/mpiexec  515  mpiexec
        rv = streamHandler.handle_active_streams(timeout=1.0)
    /opt/mpich2/bin/mpiexec  1423  ?
        mpiexec()

 
This error interrupts the running process.
I don't know where the error comes from or how I can solve this problem. Please
help me.

Best regards,
Loc Duong Dinh
SAINT, Sungkyunkwan University, South Korea.

comment:5 Changed 5 years ago by Rajeev Thakur

Does that same error happen repeatably?

Rajeev


Changed 5 years ago by loc duong ding

Added by email2trac

comment:6 Changed 5 years ago by loc duong ding

 
Yes, this error happens repeatably.

Best,
Loc Duong Dinh

comment:7 Changed 5 years ago by Rajeev Thakur

Hard to say from this output what the problem might be.

Rajeev


comment:8 Changed 5 years ago by Anthony Chan


Since it is a repeatable I/O-related issue, would you be able to send us a
small program that illustrates the problem?

A.Chan
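
The mpd traceback quoted in comment:4 fails in handle_cli_stderr_input while forwarding client stderr, so a reduced test case could plausibly just flood stderr from every rank. A hedged sketch (an assumption about the trigger, not a confirmed reproducer):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, i;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Write heavily to stderr from every rank; mpiexec/mpd must forward
       all of it, which is the code path that raised the IOError above. */
    for (i = 0; i < 100000; i++)
        fprintf(stderr, "rank %d: line %d\n", rank, i);

    MPI_Finalize();
    return 0;
}

Running it the same way as pw.x (mpiexec -machinefile /home/loc/machinefile -n 8 ./a.out) would show whether mpd fails with the same IOError.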
----- "mpich2" <mpich2-maint@mcs.anl.gov> wrote:

> ---------------------------------------------------+------------------------
>  Reporter:  loc duong ding <mambom1902@yahoo.com>  |        Owner:
>
>      Type:  bug                                    |       Status:
> closed
>  Priority:  major                                  |    Milestone:
>
> Component:  mpich2                                 |   Resolution:
> wontfix
>  Keywords:                                         |
> ---------------------------------------------------+------------------------
>
>
> Comment (by Rajeev Thakur):
>
>  {{{
>
>  Hard to say from this output what the problem might be.
>
>  Rajeev
>
>  > -----Original Message-----
>  > From: mpich2-bugs-bounces@mcs.anl.gov
>  > [mailto:mpich2-bugs-bounces@mcs.anl.gov] On Behalf Of mpich2
>  > Sent: Monday, December 08, 2008 6:05 PM
>  > To: undisclosed-recipients:
>  > Subject: Re: [mpich2-maint] #270: Mpich2 running error
>  >
>  > ---------------------------------------------------+----------
>  > --------------
>  >  Reporter:  loc duong ding <mambom1902@yahoo.com>  |
>  > Owner:
>  >      Type:  bug                                    |
>  > Status:  closed
>  >  Priority:  major                                  |
>  > Milestone:
>  > Component:  mpich2                                 |
>  > Resolution:  wontfix
>  >  Keywords:                                         |
>  > ---------------------------------------------------+----------
>  > --------------
>  >
>  >
>  > Comment (by loc duong ding):
>  >
>  >  {{{
>  >
>  >
>  >  Yes, this error happens repeatably.
>  >
>  >  Best,
>  >  Loc Duong Dinh
>  >
>  > >---------------------------------------------------+---------
>  > ---------------
>  >  >Reporter:  loc duong ding <mambom1902@yahoo.com>  |       
> Owner:
>  >    >  Type:  bug                                    |
>  > Status:  closed
>  >  >Priority:  major                                  | 
 
> Milestone:
>  >  ><Component:  mpich2                                | 
> Resolution:
>  >  wontfix.
>  >  >Keywords:                                        |
>  >
>  > ---------------------------------------------------+----------
>  > --------------
>  >
>  >
>  >  >Comment (by Rajeev Thakur):
>  >
>  >  >{{{
>  >
>  >  >Does that same error happen repeatably?
>  >
>  >  >Rajeev
>  >
>  >  >> -----Original Message-----
>  >  >> From: mpich2-bugs-bounces@mcs.anl.gov
>  >  >> [mailto:mpich2-bugs-bounces@mcs.anl.gov] On Behalf Of mpich2
>  >  >> Sent: Monday, December 08, 2008 6:50 AM
>  >  >> To: undisclosed-recipients:
>  >  >> Subject: Re: [mpich2-maint] #270: Mpich2 running error
>  >  >>
>  >  >> ---------------------------------------------------+----------
>  >  >> --------------
>  >  >>  Reporter:  loc duong ding <mambom1902@yahoo.com>  |
>  >  >> Owner:
>  >  >>      Type:  bug                                 
  |
>  >  >> Status:  closed
>  >  >>  Priority:  major                                  |
>  >  >> Milestone:
>  >  >> Component:  mpich2                                |
>  >  >> Resolution:  wontfix
>  >  >>  Keywords:                                       
|
>  >  >> ---------------------------------------------------+----------
>  >  >> --------------
>  >  >>
>  >  >
>  >  >> Comment (by loc duong ding):
>  >  >>
>  >  >  >{{{
>  >  >>
>  >  >  >Dear,
>  >  >  I see this problem again and I check the error. The error in
>  >  > the out file
>  >  >  is:
>  >  >
>  >  >   total energy              =  -313.00559496 Ry
>  >  >       Harris-Foulkes estimate   =  -314.62641196 Ry
>  >  >       estimated scf accuracy    <     0.04292274 Ry
>  >  >       total magnetization       =     2.00 Bohr mag/cell
>  >  >       absolute magnetization    =     3.41 Bohr mag/cell
>  >  >       iteration #  2     ecut=    35.00 Ry    
beta=0.60
>  >  >  mpiexec_master: mpd_uncaught_except_tb handling:
>  >  >    exceptions.IOError: [Errno 5] Input/output error
>  >  >      /opt/mpich2/bin/mpiexec  1051  handle_cli_stderr_input
>  >  >          sys.stderr.write(msg)
>  >  >      /opt/mpich2/bin/mpdlib.py  762  handle_active_streams
>  >  >          handler(stream,*args)
>  >  >      /opt/mpich2/bin/mpiexec  515  mpiexec
>  >  >          rv = streamHandler.handle_active_streams(timeout=1.0)
>  >  >      /opt/mpich2/bin/mpiexec  1423  ?
>  >  >          mpiexec()
>  >  >
>  >  >
>  >  >  This error interprut the running process.
>  >  >  I don't know where the error comes from and how I can solve
>  >  > this problem.
>  >  >  Please
>  >  >  help me.
>  >  >
>  >  >  Best regards,
>  >  >  Loc Duong Dinh
>  >  >  SAINT, Sungkyunkwan University, South Korean.
>  >  >
>  >  > ---------------------------------------------------+----------
>  >  > --------------
>  >  >  >Reporter:  loc duong ding <mambom1902@yahoo.com>  |       
> Owner:
>  >  >    >  Type:  bug                                 
  |
>  >  > Status:  new
>  >  >  >Priority:  major                                 
|   
> Milestone:
>  >  >  >Component:  mpich2                               
| 
> Resolution:
>  >  >  >Keywords:                                       
|
>  >  >
>  >  > ---------------------------------------------------+----------
>  >  > --------------
>  >  >
>  >  >
>  >  >  >Comment (by Pavan Balaji):
>  >  >
>  >  >  >{{{
>  >  >
>  >  >
>  >  >  >>  application called MPI_Abort(MPI_COMM_WORLD, 0) - process
> 1
>  >  >  >>  application called MPI_Abort(MPI_COMM_WORLD, 0) -
>  >  > process 3[cli_3]:
>  >  >  >>  aborting
>  >  >  >>  job                  :
>  >  >
>  >  >  >As the error message suggests, your application is calling
>  >  > MPI_Abort.
>  >  >  >You can search through your code to see why it's causing
>  > MPI_Abort.
>  >  >
>  >  >    >-- Pavan
>  >  >
>  >  >  --
>  >  >  >Pavan Balaji
>  >  >  >http://www.mcs.anl.gov/~balaji
>  >  >  >}}}
>  >  >
>  >  >  --
>  >  >  >Ticket URL:
>  >  >  <https://trac.mcs.anl.gov/projects/mpich2/ticket/270#comment:>
>  >  >
>  >  >
>  >  >  ________________________________
>  >  >
>  >  >  >From: mpich2 <mpich2-maint@mcs.anl.gov>
>  >  >  >Sent: Thursday, November 6, 2008 1:03:22 AM
>  >  >  >Subject: Re: [mpich2-maint] #270: Mpich2 running error
>  >  >
>  >  >
>  >  >  }}}
>  >  >
>  >  > --
>  >  > Ticket URL:
>  >  >> <https://trac.mcs.anl.gov/projects/mpich2/ticket/270#comment:>
>  >  >>
>  >
>  >  }}}
>  >
>  >  --
>  >  Ticket URL:
>  > <https://trac.mcs.anl.gov/projects/mpich2/ticket/270#comment:>
>  >
>  >
>  >
>  >  }}}
>  >
>  > --
>  > Ticket URL:
>  > <https://trac.mcs.anl.gov/projects/mpich2/ticket/270#comment:>
>  >
>
>  }}}
>
> --
> Ticket URL:
> <https://trac.mcs.anl.gov/projects/mpich2/ticket/270#comment:>

Changed 5 years ago by loc duong ding

Added by email2trac

comment:9 Changed 5 years ago by loc duong ding

I checked this error carefully again and found some more information. The CRASH
file of the code is:
 
 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
     task #         6
     from  pzpotrf  : error #        54
      problems computing cholesky decomposition
 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

I searched the Espresso forum and there is some explanation for this error:
http://www.democritos.it/pipermail/pw_forum/2008-June.txt
It may not be an error in MPICH2, but I am not sure about that.
If you want, I can give you my input file for this package, which shows the
problem.
You can download the package at: http://www.pwscf.org/download.htm

This is my input file for this code:

&CONTROL
      calculation = 'vc-relax',
      prefix='C8OOH_model11_vcrelax',
      restart_mode = 'restart',
      pseudo_dir ='/home/loc/espresso-4.0/pseudo',
      outdir='./'
      tstress = .true. ,
      tprnfor = .true. ,
      nstep =  100  ,
      etot_conv_thr = 1.0E-4 ,
      forc_conv_thr = 1.0D-3 ,
      dt = 100 ,
;
&SYSTEM
      ibrav= 4,  celldm(1) = 9.6561, celldm(3)=2.6223, nat = 22, ntyp = 3,
nspin=2,
      ecutwfc =35, ecutrho = 210, occupations='smearing', degauss=0.0001,
      starting_magnetization(1) = 0.0,
      starting_magnetization(2) = 0.3,
      starting_magnetization(3) = 0.5   
;
&ELECTRONS
    startingwfc = 'atomic'
    mixing_mode = 'plain'
    mixing_beta = 0.6
    conv_thr = 1.0e-6
    electron_maxstep= 150
;
&IONS
    upscale = 15
;
&CELL
   cell_dynamics = 'bfgs' ,
   press = 0.00 ,
   wmass =  0.00150000  ,
;
ATOMIC_SPECIES
 C  12.011  C.pbe-rrkjus.UPF
 O  15.9994 O.pbe-rrkjus.UPF
 H  1.008   H.pbe-rrkjus.UPF
ATOMIC_POSITIONS {angstrom}
C       14.668639195  16.364791021  10.258918111
C       15.937254439  17.107960454  10.286714929
C       17.164561436  16.396716693  10.285152277
C       15.953243711  18.584610189  10.263683339
C       18.455194677  17.125292198   9.923043528
C       17.158408500  19.342595249  10.278747473
C       18.439908989  18.574325910  10.451748908
C       19.749065047  19.329036627  10.467833076
O       19.073785361  18.834350939  11.714491175
O       18.359047912  17.228522459   8.462096728
H       19.250644091  17.418265856   8.101422280
C       15.887265182  17.078752420  16.941524968
C       17.155880835  17.821106817  16.971572207
C       18.382991387  17.109537508  16.971809032
C       17.172522625  19.298060392  16.947640633
C       19.673132113  17.837864910  16.603847009
C       18.376934058  20.055968139  16.960975322
C       19.658524761  19.287069550  17.130483393
C       20.967474043  20.041325812  17.152120359
O       20.288654034  19.543055555  18.395941555
O       19.571576169  17.935034850  15.143694987
H       20.458291435  18.131756450  14.774538711
K_POINTS {automatic}
4 4 1 0 0 0

If it is not an error in MPICH2, I am really sorry for bothering you with my
confusion.
Thank you for everything.
 
Best regards,
Loc Duong Dinh.


comment:10 Changed 5 years ago by Anthony Chan


Are you saying that PWscf failed internally in the Cholesky decomposition with
your input data file, and that this then causes mpd to fail? We don't have the
resources to debug PWscf for you; you need to be more specific about what
MPICH2 fails to do...

A.Chan

----- "mpich2" <mpich2-maint@mcs.anl.gov> wrote:

> ---------------------------------------------------+------------------------
>  Reporter:  loc duong ding <mambom1902@yahoo.com>  |        Owner:
>
>      Type:  bug                                    |       Status:
> closed
>  Priority:  major                                  |    Milestone:
>
> Component:  mpich2                                 |   Resolution:
> wontfix
>  Keywords:                                         |
> ---------------------------------------------------+------------------------
>
>
> Comment (by loc duong ding):
>
>  {{{
>
>  I checked this error carreffully again and found another information.
> THe
>  CRASH
>  file of code is:
>
>
> %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
>       task #         6
>       from  pzpotrf  : error #        54
>        problems computing cholesky decomposition
>
> %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
>
>  I search from Espresso forum and there is some explanation for this
> error.
>  http://www.democritos.it/pipermail/pw_forum/2008-June.txt
>  It may be not the error of MPICH2. But I don't sure about that.
>  If you want, I can give you my input file of this package which have
> the
>  problem.
>  You can download the package in:   http://www.pwscf.org/download.htm
>
>  This is my input file of this code:
>
>  &CONTROL
>        calculation = 'vc-relax',
>        prefix='C8OOH_model11_vcrelax',
>        restart_mode = 'restart',
>        pseudo_dir ='/home/loc/espresso-4.0/pseudo',
>        outdir='./'
>        tstress = .true. ,
>        tprnfor = .true. ,
>        nstep =  100  ,
>        etot_conv_thr = 1.0E-4 ,
>        forc_conv_thr = 1.0D-3 ,
>        dt = 100 ,
>  /
>  &SYSTEM
>        ibrav= 4,  celldm(1) = 9.6561, celldm(3)=2.6223, nat = 22, ntyp
> = 3,
>  nspin=2,
>        ecutwfc =35, ecutrho = 210, occupations='smearing',
> degauss=0.0001,
>        starting_magnetization(1) = 0.0,
>        starting_magnetization(2) = 0.3,
>        starting_magnetization(3) = 0.5
>  /
>  &ELECTRONS
>      startingwfc = 'atomic'
>      mixing_mode = 'plain'
>      mixing_beta = 0.6
>      conv_thr = 1.0e-6
>      electron_maxstep= 150
>  /
>  &IONS
>      upscale = 15
>  /
>  &CELL
>     cell_dynamics = 'bfgs' ,
>     press = 0.00 ,
>     wmass =  0.00150000  ,
>  /
>  ATOMIC_SPECIES
>   C  12.011  C.pbe-rrkjus.UPF
>   O  15.9994 O.pbe-rrkjus.UPF
>   H  1.008   H.pbe-rrkjus.UPF
>  ATOMIC_POSITIONS {angstrom}
>  C       14.668639195  16.364791021  10.258918111
>  C       15.937254439  17.107960454  10.286714929
>  C       17.164561436  16.396716693  10.285152277
>  C       15.953243711  18.584610189  10.263683339
>  C       18.455194677  17.125292198   9.923043528
>  C       17.158408500  19.342595249  10.278747473
>  C       18.439908989  18.574325910  10.451748908
>  C       19.749065047  19.329036627  10.467833076
>  O       19.073785361  18.834350939  11.714491175
>  O       18.359047912  17.228522459   8.462096728
>  H       19.250644091  17.418265856   8.101422280
>  C       15.887265182  17.078752420  16.941524968
>  C       17.155880835  17.821106817  16.971572207
>  C       18.382991387  17.109537508  16.971809032
>  C       17.172522625  19.298060392  16.947640633
>  C       19.673132113  17.837864910  16.603847009
>  C       18.376934058  20.055968139  16.960975322
>  C       19.658524761  19.287069550  17.130483393
>  C       20.967474043  20.041325812  17.152120359
>  O       20.288654034  19.543055555  18.395941555
>  O       19.571576169  17.935034850  15.143694987
>  H       20.458291435  18.131756450  14.774538711
>  K_POINTS {automatic}
>  4 4 1 0 0 0
>
>  If it is not the error of MPICH2, I am really sorry for bothering
> you
>  about my
>  confusions.
>  Thank for all.
>
>  Best regards,
>  Loc Duong Dinh.
>
>  ________________________________
>  >From: mpich2 <mpich2-maint@mcs.anl.gov>
>  >Sent: Tuesday, December 9, 2008 2:35:10 PM
>  >Subject: Re: [mpich2-maint] #270: Mpich2 running error
>
>
> ---------------------------------------------------+------------------------
>  >Reporter:  loc duong ding <mambom1902@yahoo.com>  |        Owner:
>   >   Type:  bug                                    | 
    Status: 
> closed
>  >Priority:  major                                  |   
Milestone:
>  >Component:  mpich2                                | 
Resolution: 
> wontfix
>  >Keywords:                                        |
>
> ---------------------------------------------------+------------------------
>
>
>  >Comment (by Anthony Chan):
>
>  >{{{
>
>
>  >Since it is a repeatable I/O related issue, would
>  >you be able to send us a small program that
>  >illustrates the problem ?
>
>  >A.Chan
>  >----- "mpich2" <mpich2-maint@mcs.anl.gov> wrote:
>
>  >
>
> ---------------------------------------------------+------------------------
>  >  Reporter:  loc duong ding <mambom1902@yahoo.com>  |        Owner:
>  >
>  >      Type:  bug                                   
|      Status:
>  > closed
>  >  Priority:  major                                  | 
  Milestone:
>  >
>  > Component:  mpich2                                | 
Resolution:
>  > wontfix
>  >  Keywords:                                        |
>  >
>
> ---------------------------------------------------+------------------------
>  >
>  >
>  > Comment (by Rajeev Thakur):
>  >
>  >  {{{
>  >
>  >  Hard to say from this output what the problem might be.
>  >
>  >  Rajeev
>  >
>  >  > -----Original Message-----
>  >  > From: mpich2-bugs-bounces@mcs.anl.gov
>  >  > [mailto:mpich2-bugs-bounces@mcs.anl.gov] On Behalf Of mpich2
>  >  > Sent: Monday, December 08, 2008 6:05 PM
>  >  > To: undisclosed-recipients:
>  >  > Subject: Re: [mpich2-maint] #270: Mpich2 running error
>  >  >
>  >  > ---------------------------------------------------+----------
>  >  > --------------
>  >  >  Reporter:  loc duong ding <mambom1902@yahoo.com>  |
>  >  > Owner:
>  >  >      Type:  bug                                 
  |
>  >  > Status:  closed
>  >  >  Priority:  major                                  |
>  >  > Milestone:
>  >  > Component:  mpich2                                |
>  >  > Resolution:  wontfix
>  >  >  Keywords:                                       
|
>  >  > ---------------------------------------------------+----------
>  >  > --------------
>  >  >
>  >  >
>  >  > Comment (by loc duong ding):
>  >  >
>  >  >  {{{
>  >  >
>  >  >
>  >  >  Yes, this error happens repeatably.
>  >  >
>  >  >  Best,
>  >  >  Loc Duong Dinh
>  >  >
>  >  > >---------------------------------------------------+---------
>  >  > ---------------
>  >  >  >Reporter:  loc duong ding <mambom1902@yahoo.com>  |
>  > Owner:
>  >  >    >  Type:  bug                                 
  |
>  >  > Status:  closed
>  >  >  >Priority:  major                                 
|
>
>  > Milestone:
>  >  >  ><Component:  mpich2                               
|
>  > Resolution:
>  >  >  wontfix.
>  >  >  >Keywords:                                       
|
>  >  >
>  >  > ---------------------------------------------------+----------
>  >  > --------------
>  >  >
>  >  >
>  >  >  >Comment (by Rajeev Thakur):
>  >  >
>  >  >  >{{{
>  >  >
>  >  >  >Does that same error happen repeatably?
>  >  >
>  >  >  >Rajeev
>  >  >
>  >  >  >> -----Original Message-----
>  >  >  >> From: mpich2-bugs-bounces@mcs.anl.gov
>  >  >  >> [mailto:mpich2-bugs-bounces@mcs.anl.gov] On Behalf Of
> mpich2
>  >  >  >> Sent: Monday, December 08, 2008 6:50 AM
>  >  >  >> To: undisclosed-recipients:
>  >  >  >> Subject: Re: [mpich2-maint] #270: Mpich2 running error
>  >  >  >>
>  >  >  >>
> ---------------------------------------------------+----------
>  >  >  >> --------------
>  >  >  >>  Reporter:  loc duong ding <mambom1902@yahoo.com>  |
>  >  >  >> Owner:
>  >  >  >>      Type:  bug
>    |
>  >  >  >> Status:  closed
>  >  >  >>  Priority:  major                               
  |
>  >  >  >> Milestone:
>  >  >  >> Component:  mpich2                               
|
>  >  >  >> Resolution:  wontfix
>  >  >  >>  Keywords:
>  |
>  >  >  >>
> ---------------------------------------------------+----------
>  >  >  >> --------------
>  >  >  >>
>  >  >  >
>  >  >  >> Comment (by loc duong ding):
>  >  >  >>
>  >  >  >  >{{{
>  >  >  >>
>  >  >  >  >Dear,
>  >  >  >  I see this problem again and I check the error. The error
> in
>  >  >  > the out file
>  >  >  >  is:
>  >  >  >
>  >  >  >   total energy              =  -313.00559496 Ry
>  >  >  >       Harris-Foulkes estimate   =  -314.62641196 Ry
>  >  >  >       estimated scf accuracy    <     0.04292274 Ry
>  >  >  >       total magnetization       =     2.00 Bohr
mag/cell
>  >  >  >       absolute magnetization    =     3.41 Bohr
mag/cell
>  >  >  >       iteration #  2     ecut=    35.00 Ry
>  beta=0.60
>  >  >  >  mpiexec_master: mpd_uncaught_except_tb handling:
>  >  >  >    exceptions.IOError: [Errno 5] Input/output error
>  >  >  >      /opt/mpich2/bin/mpiexec  1051  handle_cli_stderr_input
>  >  >  >          sys.stderr.write(msg)
>  >  >  >      /opt/mpich2/bin/mpdlib.py  762  handle_active_streams
>  >  >  >          handler(stream,*args)
>  >  >  >      /opt/mpich2/bin/mpiexec  515  mpiexec
>  >  >  >          rv =
> streamHandler.handle_active_streams(timeout=1.0)
>  >  >  >      /opt/mpich2/bin/mpiexec  1423  ?
>  >  >  >          mpiexec()
>  >  >  >
>  >  >  >
>  >  >  >  This error interprut the running process.
>  >  >  >  I don't know where the error comes from and how I can solve
>  >  >  > this problem.
>  >  >  >  Please
>  >  >  >  help me.
>  >  >  >
>  >  >  >  Best regards,
>  >  >  >  Loc Duong Dinh
>  >  >  >  SAINT, Sungkyunkwan University, South Korean.
>  >  >  >
>  >  >  >
> ---------------------------------------------------+----------
>  >  >  > --------------
>  >  >  >  >Reporter:  loc duong ding <mambom1902@yahoo.com>  |
>  > Owner:
>  >  >  >    >  Type:  bug
>    |
>  >  >  > Status:  new
>  >  >  >  >Priority:  major
>  |
>  > Milestone:
>  >  >  >  >Component:  mpich2
>  |
>  > Resolution:
>  >  >  >  >Keywords:
>  |
>  >  >  >
>  >  >  >
> ---------------------------------------------------+----------
>  >  >  > --------------
>  >  >  >
>  >  >  >
>  >  >  >  >Comment (by Pavan Balaji):
>  >  >  >
>  >  >  >  >{{{
>  >  >  >
>  >  >  >
>  >  >  >  >>  application called MPI_Abort(MPI_COMM_WORLD, 0) -
> process
>  > 1
>  >  >  >  >>  application called MPI_Abort(MPI_COMM_WORLD, 0) -
>  >  >  > process 3[cli_3]:
>  >  >  >  >>  aborting
>  >  >  >  >>  job                  :
>  >  >  >
>  >  >  >  >As the error message suggests, your application is calling
>  >  >  > MPI_Abort.
>  >  >  >  >You can search through your code to see why it's causing
>  >  > MPI_Abort.
>  >  >  >
>  >  >  >    >-- Pavan
>  >  >  >
>  >  >  >  --
>  >  >  >  >Pavan Balaji
>  >  >  >  >http://www.mcs.anl.gov/~balaji
>  >  >  >  >}}}
>  >  >  >
>  >  >  >  --
>  >  >  >  >Ticket URL:
>  >  >  > 
> <https://trac.mcs.anl.gov/projects/mpich2/ticket/270#comment:>
>  >  >  >
>  >  >  >
>  >  >  >  ________________________________
>  >  >  >
>  >  >  >  >From: mpich2 <mpich2-maint@mcs.anl.gov>
>  >  >  >  >Sent: Thursday, November 6, 2008 1:03:22 AM
>  >  >  >  >Subject: Re: [mpich2-maint] #270: Mpich2 running error
>  >  >  >
>  >  >  >
>  >  >  >  }}}
>  >  >  >
>  >  >  > --
>  >  >  > Ticket URL:
>  >  >  >>
> <https://trac.mcs.anl.gov/projects/mpich2/ticket/270#comment:>
>  >  >  >>
>  >  >
>  >  >  }}}
>  >  >
>  >  >  --
>  >  >  Ticket URL:
>  >  > <https://trac.mcs.anl.gov/projects/mpich2/ticket/270#comment:>
>  >  >
>  >  >
>  >  >
>  >  >  }}}
>  >  >
>  >  > --
>  >  > Ticket URL:
>  >  > <https://trac.mcs.anl.gov/projects/mpich2/ticket/270#comment:>
>  >  >
>  >
>  >  }}}
>  >
>  > --
>  > Ticket URL:
>  > <https://trac.mcs.anl.gov/projects/mpich2/ticket/270#comment:>
>  }}}
>
>  --
>  Ticket URL:
> <https://trac.mcs.anl.gov/projects/mpich2/ticket/270#comment:>
>
>
>
>  }}}
>
> --
> Ticket URL:
> <https://trac.mcs.anl.gov/projects/mpich2/ticket/270#comment:>

Changed 5 years ago by loc duong ding

Added by email2trac

comment:11 Changed 5 years ago by loc duong ding

Dear MPICH advisers,
I successfully installed MPICH2 with gfortran. Now I am trying to compile it
with the Intel compiler in order to install the VASP code.
I tried to install MPICH2 (1.0.7) with the Intel compiler (10.1.018). I get
some errors.
The first error is:

Command:
$./configure --prefix=/opt/mpich2 CC=icc CXX=icpc CFLAGS=-O3-wW-tpp7  FC=ifort
F77=ifort F90=ifort FFLAGS=-O3-wW-tpp7 --disable-mpe --disable-romio
--enable-fast
Configuring MPICH2 version 1.0.7 with '--prefix=/opt/mpich2' 'CC=icc' 'CXX=icpc'
'CFLAGS=-O3-wW-tpp7' 'FC=ifort' 'F77=ifort' 'F90=ifort' 'FFLAGS=-O3-wW-tpp7'
'--disable-mpe' '--disable-romio' '--enable-fast'
Running on system: Linux loc 2.6.27.5-117.fc10.x86_64 #1 SMP Tue Nov 18 11:58:53
EST 2008 x86_64 x86_64 x86_64 GNU/Linux
Executing mpich2prereq in /home/loc/back_up/mpich2-1.0.7/src/mpid/ch3 with
Executing mpich2prereq in
/home/loc/back_up/mpich2-1.0.7/src/mpid/ch3/channels/sock
sourcing /home/loc/back_up/mpich2-1.0.7/src/pm/mpd/mpich2prereq
sourcing /home/loc/back_up/mpich2-1.0.7/src/pm/mpd/setup_pm
checking for gcc... icc
checking for C compiler default output file name... a.out
checking whether the C compiler works... yes
checking whether we are cross compiling... no
checking for suffix of executables...
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether icc accepts -g... yes
checking for icc option to accept ANSI C... none needed
checking whether C compiler accepts option -O2... yes
checking whether routines compiled with -O2 can be linked with ones compiled
without -O2... yes
checking for type of weak symbol support... pragma weak
checking whether __attribute__ ((weak)) allowed... yes
checking for multiple weak symbol support... yes
checking whether we are using the GNU Fortran 77 compiler... no
checking whether ifort accepts -g... yes
checking whether Fortran 77 compiler accepts option -O2... yes
checking whether routines compiled with -O2 can be linked with ones compiled
without -O2... yes
checking how to get verbose linking output from ifort... -v
checking for Fortran libraries of ifort...  -L/opt/intel/fce/10.1.018/lib
-L/opt/gcc/lib/gcc/x86_64-unknown-linux-gnu/4.3.1/ -L/opt/gcc/lib/gcc/x86_64
-unknown-linux-gnu/4.3.1/../../../../lib64 -L/usr/lib/../lib64 -lifport -lifcore
-limf -lsvml -lm -lipgo -lirc -lgcc_s -lirc_s -ldl
checking whether ifort accepts the FLIBS found by autoconf... yes
checking whether C can link with  -L/opt/intel/fce/10.1.018/lib
-L/opt/gcc/lib/gcc/x86_64-unknown-linux-gnu/4.3.1/ -L/opt/gcc/lib/gcc/x86_64
-unknown-linux-gnu/4.3.1/../../../../lib64 -L/usr/lib/../lib64 -lifport -lifcore
-limf -lsvml -lm -lipgo -lirc -lgcc_s -lirc_s -ldl... yes
checking for linker for Fortran main programs... Use Fortran to link programs
checking for Fortran 77 name mangling... lower underscore
checking what libraries are needed to link Fortran programs with C routines that
use stdio... none
checking that f works as the extension for Fortran 90 program... yes
checking whether we are using the GNU Fortran 90 compiler... no
checking whether ifort accepts -g... yes
checking for extension for Fortran 90 programs... f90
checking whether the Fortran 90 compiler (ifort  -DNDEBUG ) works... yes
checking whether the Fortran 90 compiler (ifort  -DNDEBUG ) is a cross-
compiler... no
checking whether Fortran 90 works with Fortran 77... yes
checking whether Fortran accepts ! for comments... yes
checking for include directory flag for Fortran... -I
checking for Fortran 77 flag for library directories... -L
checking for which Fortran libraries are needed to link C with Fortran... none
checking whether Fortran compiler processes .F files with C preprocessor... yes
checking that f works as the extension for Fortran 90 program... yes
checking whether we are using the GNU Fortran 90 compiler... (cached) no
checking whether ifort accepts -g... (cached) yes
checking for extension for Fortran 90 programs... f90
checking whether the Fortran 90 compiler (ifort  -DNDEBUG ) works... yes
checking whether the Fortran 90 compiler (ifort  -DNDEBUG ) is a cross-
compiler... no
checking for Fortran 90 module extension... mod
checking for Fortran 90 module include flag... -I
checking whether Fortran 90 accepts f90 suffix... yes
checking whether Fortran 90 compiler accepts option -O2... yes
checking whether routines compiled with -O2 can be linked with ones compiled
without -O2... yes
checking whether Fortran 90 compiler processes .F90 files with C preprocessor...
yes
checking what libraries are needed to link Fortran90 programs with C routines
that use stdio... none
checking for f90 compiler vendor... intel
checking for c++... icpc
checking whether we are using the GNU C++ compiler... yes
checking whether icpc accepts -g... yes
checking whether the C++ compiler icpc can build an executable... yes
checking whether the compiler supports exceptions... yes
checking whether the compiler recognizes bool as a built-in type... yes
checking whether the compiler implements namespaces... yes
checking whether <iostream> available... yes
checking whether the compiler implements the namespace std... yes
checking whether <math> available... no
checking for GNU g++ version... 4 . 3
checking whether C++ compiler accepts option -O2... yes
checking whether routines compiled with -O2 can be linked with ones compiled
without -O2... yes
checking for perl... /usr/bin/perl
checking for ar... ar
checking for ranlib... ranlib
checking for etags... no
checking for killall... killall
checking for a BSD-compatible install... /usr/bin/install -c
checking whether install works... yes
checking whether install breaks libraries... no
checking whether mkdir -p works... yes
checking for make... make
checking whether clock skew breaks make... no
checking whether make supports include... yes
checking whether make allows comments in actions... yes
checking for virtual path format... VPATH
checking whether make sets CFLAGS... yes
checking for bash... /bin/sh
checking whether /bin/sh supports arrays... yes
checking for doctext... false
checking for location of doctext style files... unavailable
checking for an ANSI C-conforming const... yes
checking for volatile... yes
checking for restrict... __restrict
checking for inline... inline
checking whether __attribute__ allowed... yes
checking whether __attribute__((format)) allowed... yes
checking whether byte ordering is bigendian... no
checking whether C compiler allows unaligned doubles... yes
checking whether icc supports __func__... yes
Using gcc to determine dependencies
checking whether long double is supported... yes
checking whether long long is supported... yes
checking for max C struct integer alignment... eight
checking for max C struct floating point alignment... sixteen
checking for max C struct alignment of structs with doubles... eight
checking for max C struct floating point alignment with long doubles... sixteen
configure: WARNING: Structures containing long doubles may be aligned
differently from structures with floats or longs.  MPICH2 does not handle this
case automatically and you should avoid assumed extents for structures
containing float types.
checking if alignment of structs with doubles is based on position... no
checking if alignment of structs with long long ints is based on position... no
checking if double alignment breaks rules, find actual alignment... no
checking for alignment restrictions on pointers... int or better
checking for egrep... grep -E
checking for ANSI C header files... no
checking for sys/types.h... yes
checking for sys/stat.h... yes
checking for stdlib.h... yes
checking for string.h... yes
checking for memory.h... yes
checking for strings.h... yes
checking for inttypes.h... yes
checking for stdint.h... yes
checking for unistd.h... yes
checking for char... yes
checking size of char... 1
checking for short... yes
checking size of short... 2
checking for int... yes
checking size of int... 4
checking for long... yes
checking size of long... 8
checking for long long... yes
checking size of long long... 8
checking for float... yes
checking size of float... 4
checking for double... yes
checking size of double... 8
checking for long double... yes
checking size of long double... 16
checking for wchar_t... yes
checking size of wchar_t... 4
checking for void *... yes
checking size of void *... 8
checking for size of float int... 8
checking for size of double int... 12
checking for size of long int... 12
checking for size of short int... 6
checking for size of 2 int... 8
checking for size of long double int... 20
checking for sys/bitypes.h... yes
checking for int16_t... yes
checking for int32_t... yes
checking for int64_t... yes
checking for size of Fortran type integer... 4
checking for size of Fortran type real... 4
checking for size of Fortran type double precision... 8
checking whether integer*1 is supported... yes
checking whether integer*2 is supported... yes
checking whether integer*4 is supported... yes
checking whether integer*8 is supported... yes
checking whether integer*16 is supported... no
checking whether real*4 is supported... yes
checking whether real*8 is supported... yes
checking whether real*16 is supported... yes
checking for C type matching Fortran integer... int
checking for size of MPI_Status... 20
checking for values of Fortran logicals... True is -1 and False is 0
checking how to run the C preprocessor... /lib/cpp
checking for Fortran 90 integer kind for 8-byte integers... 8
checking for bool... yes
checking size of bool... 1
checking how to run the C++ preprocessor... /lib/cpp
checking for complex... no
checking if char * pointers use byte addresses... yes
checking for alignment restrictions on int64_t... no
checking for alignment restrictions on int32_t... no
checking for size of MPI_BSEND_OVERHEAD... 95
checking for gcc __asm__ and pentium cmpxchgl instruction... no
checking for gcc __asm__ and AMD x86_64 cmpxchgq instruction... yes
checking for gcc __asm__ and IA64 xchg4 instruction... no
checking for ANSI C header files... (cached) no
checking for stdlib.h... (cached) yes
checking for stdarg.h... yes
checking for sys/types.h... (cached) yes
checking for inttypes.h... (cached) yes
checking for limits.h... no
checking for stddef.h... yes
checking for errno.h... yes
checking for sys/socket.h... yes
checking for sys/time.h... yes
checking for unistd.h... (cached) yes
checking for endian.h... yes
checking for assert.h... yes
checking for sys/uio.h... yes
checking for size_t... yes
checking for setitimer... no
checking for alarm... no
checking for vsnprintf... no
checking for vsprintf... no
checking for strerror... no
checking for snprintf... no
checking for va_copy... yes
checking for working alloca.h... yes
checking for alloca... yes
checking for strdup... no
checking for clock_gettime... no
checking for clock_getres... no
checking for gethrtime... no
checking for gettimeofday... no
configure: error: No timer found
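For reference, each timer check above just compiles and links a tiny test
program. A rough hand-run equivalent for gettimeofday (illustrative only, not
MPICH2's exact probe; timer_test.c is an arbitrary file name):

#include <stdio.h>
#include <sys/time.h>

int main(void)
{
    struct timeval tv;

    /* gettimeofday() is declared in <sys/time.h> on Linux.  If this
       program fails to compile or link with the same CC and CFLAGS
       given to configure, the compiler setup is at fault, not the
       system. */
    if (gettimeofday(&tv, NULL) != 0) {
        perror("gettimeofday");
        return 1;
    }
    printf("seconds since epoch: %ld\n", (long) tv.tv_sec);
    return 0;
}

Trying it both as "icc timer_test.c" and as "icc -O3-wW-tpp7 timer_test.c"
(the CFLAGS value from the configure line) should show whether the flags are
what makes every probe report "no".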

After that, I tried enabling the timer:
Error when compiling MPICH2.
Command:
$./configure --prefix=/opt/mpich2 CC=icc CXX=icpc CFLAGS=-O3-wW-tpp7  FC=ifort
F77=ifort F90=ifort FFLAGS=-O3-wW-tpp7 --enable-timer-type=linux86_cycle
--disable-mpe --disable-romio --enable-fast
........
checking for C compiler default output file name... a.out
checking whether the C compiler works... yes
checking whether we are cross compiling... no
checking for suffix of executables...
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether icc accepts -g... yes
checking for icc option to accept ANSI C... none needed
checking for inline... inline
checking for poll... no
configure: error: This device requires the poll function
configure: error: /bin/sh
'/home/loc/back_up/mpich2-1.0.7/src/mpid/common/sock/poll/configure' failed for
poll
configure: error: Configure of src/mpid/common/sock failed!
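The poll check is the same kind of one-function probe. A minimal hand test,
assuming the standard poll(2) interface from <poll.h> (again a sketch, not
the configure test itself):

#include <poll.h>
#include <stdio.h>

int main(void)
{
    struct pollfd pfd;
    int rc;

    pfd.fd = 0;            /* stdin */
    pfd.events = POLLIN;
    pfd.revents = 0;
    /* Zero timeout: poll() returns immediately.  poll() exists on any
       Linux system, so a "no" from configure points at the compiler
       or flags rather than at the operating system. */
    rc = poll(&pfd, 1, 0);
    printf("poll returned %d\n", rc);
    return rc < 0;
}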

How can I solve this problem? I would greatly appreciate your help.

Best regards,
Loc Duong Dinh
SAINT, Sungkyunkwan University, South Korea.
mambom1902@yahoo.com


comment:12 Changed 5 years ago by Rajeev Thakur

Can you compile this simple program with icc?

#include <limits.h>

int main(void) {
    return 1;
}
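To separate a header problem from a flags problem, it may help to build it
both with plain icc and with the CFLAGS value from the configure line
(limits_test.c is an arbitrary file name):

$ icc limits_test.c
$ icc -O3-wW-tpp7 limits_test.c

If plain icc also fails, the gcc 4.3.x header incompatibility described just
below is the likely cause; if only the second command fails, note that
-O3-wW-tpp7 is passed to icc as a single run-together option.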


If not, are you using gcc 4.3.x on your system? There seems to be some
incompatibility between Intel 10.x compilers and gcc 4.3.x. See
http://software.intel.com/en-us/forums/intel-c-compiler/topic/59886 . (icc
uses some gcc header files.)
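A quick way to confirm the pairing, assuming gcc and icc are both in the
PATH:

$ gcc --version
$ icc -V

icc normally picks up the headers of whichever gcc is found in the PATH, so
these two reports show the exact combination in use.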

You could try using gcc 4.2.x with Intel 10.x. Or you could compile MPICH2
using the Intel Fortran compiler and GNU compilers for C and C++.
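For the second option, a sketch of the configure line (optimization flags
omitted here; note that the original CFLAGS/FFLAGS value -O3-wW-tpp7 runs
three options together without spaces):

$ ./configure --prefix=/opt/mpich2 CC=gcc CXX=g++ \
      FC=ifort F77=ifort F90=ifort \
      --disable-mpe --disable-romio --enable-fast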

Rajeev


Note: See TracTickets for help on using tickets.