
Search results for query: *

  1. FloatingFeather

    Problem with llapack and DGELS

    OK, I have just solved the issue. A routine in my code had the same name as an internal LAPACK routine, so the library ended up calling my version instead of its own.
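    The post does not name the clashing routine, so the sketch below uses a purely hypothetical example: DGELS internally calls the LAPACK auxiliary DLASCL, and if your own source happens to define a routine with that name, the linker typically resolves LAPACK's internal call to your version, and DGELS silently returns garbage.

      ! Hypothetical illustration of the clash (the real offending name is
      ! not given in the thread).  DGELS calls the auxiliary routine DLASCL
      ! internally; this unrelated user routine hijacks that call at link time.
      subroutine dlascl(x, n)
         implicit none
         integer, intent(in) :: n
         double precision, intent(inout) :: x(n)
         x = 2.0d0 * x            ! some private computation
      end subroutine dlascl
      ! Linking this object together with -llapack typically makes every
      ! internal DLASCL call inside DGELS land here, with mismatched
      ! arguments.  Renaming the user routine (e.g. to my_scale) fixes it.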
  2. FloatingFeather

    Problem with llapack and DGELS

    I have found that if I delete one subroutine from the program that misbehaves, DGELS gives the correct result. The problem occurs even if I never call this subroutine (a large piece of code that I inserted into my own code in order to call it). The thing is that I actually need to call this...
  3. FloatingFeather

    Problem with llapack and DGELS

    Hi. I don't know what is causing this. I have written two different programs and wanted to reuse a subroutine I wrote in both of them. This subroutine calls DGELS to do some computations, but while it works fine in one program, it gives a wrong result in the other. I don't really know...
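    For reference, a minimal self-contained DGELS call looks roughly like the sketch below (the matrices and sizes here are made up, since the poster's subroutine is not shown). DGELS overwrites A and returns the solution in the first n rows of B, and a workspace query with LWORK = -1 is the usual way to size WORK.

      program dgels_demo
         implicit none
         integer, parameter :: m = 4, n = 2, nrhs = 1
         double precision :: a(m,n), b(m,nrhs), wq(1)
         double precision, allocatable :: work(:)
         integer :: lwork, info

         ! Least-squares fit of y = c1 + c2*x through four points.
         a(:,1) = 1.0d0
         a(:,2) = [ 1.0d0, 2.0d0, 3.0d0, 4.0d0 ]
         b(:,1) = [ 2.1d0, 3.9d0, 6.2d0, 7.8d0 ]

         ! Workspace query: the optimal LWORK is returned in wq(1).
         call dgels('N', m, n, nrhs, a, m, b, m, wq, -1, info)
         lwork = int(wq(1))
         allocate(work(lwork))

         ! Actual solve; on exit b(1:n,1) holds the coefficients.
         call dgels('N', m, n, nrhs, a, m, b, m, work, lwork, info)
         if (info /= 0) stop 'DGELS failed'
         print '(a,2f10.5)', 'coefficients: ', b(1:n,1)
      end program dgels_demo
      ! Build with something like: gfortran dgels_demo.f90 -llapack -lblas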
  4. FloatingFeather

    Segmentation fault, mpi and gfortran

    Hi. Is it possible that I get the segmentation fault because somebody else is using the same node as I am? The cluster is having problems, so I think it is possible that somebody else is running something on a node I am using.
  5. FloatingFeather

    Error in cluster when trying to run in more than a single node

    Hi. The cluster where I work started to have problems after the old administrator left. I am trying to figure out why problems appear whenever a job is submitted across multiple nodes: error: executing task of job 68 failed: execution daemon on host...
  6. FloatingFeather

    Segmentation fault, mpi and gfortran

    Hi. Thanks for your reply. When I run it locally, it is on a 64-bit system (Ryzen 7). Even though the cluster hasn't let me use -fcheck=bounds, I could trace where the error originates, and it's quite weird: the error occurs when deallocating an array, but I still don't understand why...
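    A crash at DEALLOCATE is quite often only the symptom: an earlier out-of-bounds write can corrupt the allocator's bookkeeping, and the program then dies when the memory is freed. A contrived example of that pattern (not the poster's code) is sketched below; compiling it locally with gfortran -g -fcheck=all reports the bad store at the loop instead of a mysterious segfault at the deallocation.

      program corrupt_then_crash
         implicit none
         double precision, allocatable :: a(:)
         integer :: i
         allocate(a(10))
         do i = 1, 11              ! off-by-one: writes one element past the end
            a(i) = dble(i)
         end do
         deallocate(a)             ! without bounds checking, the crash tends to
         print *, 'done'           ! show up here rather than inside the loop
      end program corrupt_then_crash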
  7. FloatingFeather

    Segmentation fault, mpi and gfortran

    Hi. I have this error, which occurs only when I run my code on a cluster. On my desktop computer it runs in parallel without any problem. [compute-0-2:76201] *** Process received signal *** [compute-0-2:76201] Signal: Segmentation fault (11) [compute-0-2:76201] Signal code: Address not...
  8. FloatingFeather

    Modules in fortran 90

    OK. I think I'm not doing things quite correctly, and my understanding of the compilation procedure is only superficial (I know the compiler just translates my Fortran code into machine code; that's all it does as far as I'm concerned). I'm not familiar with what you are saying about trees, but I understand...
  9. FloatingFeather

    Modules in fortran 90

    Hi. Thank you again. I'm not familiar with the terminology you are using, sorry about that. I actually don't know what "running the build" means. I compile my code on a Linux system using gfortran; basically I type something like "gfortran mycode.f90 -o mycompiledcode.x" at the command line. You...
  10. FloatingFeather

    Modules in fortran 90

    Hi. What do you mean by running the build twice? Thanks!
  11. FloatingFeather

    Modules in fortran 90

    I am using a subroutine in a code I have written, and it relies on modules. Something weird happens when I compile the code with the module that corresponds to this subroutine. Actually, the whole subroutine is provided as a module (it is an open-access subroutine). The thing is that, in order to...
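    A frequent cause of "something weird" with modules is build order: gfortran writes a .mod file when it compiles the module, and any file that USEs the module needs that .mod file to exist already (a stale .mod from an old build causes equally confusing errors, which is also why "running the build twice" sometimes helps). A minimal sketch with made-up file names:

      ! mysub_mod.f90  (hypothetical file name)
      module mysub_mod
         implicit none
      contains
         subroutine mysub(x)
            double precision, intent(inout) :: x
            x = 2.0d0 * x
         end subroutine mysub
      end module mysub_mod

      ! main.f90
      program main
         use mysub_mod
         implicit none
         double precision :: x
         x = 1.5d0
         call mysub(x)
         print *, x
      end program main

      ! Compile the module first so mysub_mod.mod exists, then the program:
      !   gfortran -c mysub_mod.f90
      !   gfortran -c main.f90
      !   gfortran mysub_mod.o main.o -o mycode.x
      ! or in one go, listing the module source before the program:
      !   gfortran mysub_mod.f90 main.f90 -o mycode.x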
  12. FloatingFeather

    Question regarding gfortran compiler

    Nice. Thanks. So the compiler wouldn't do the job for me?
  13. FloatingFeather

    Question regarding gfortran compiler

    Hi. I have a doubt. I have written a code that integrates by means of the composite Simpson's rule: https://en.wikipedia.org/wiki/Simpson%27s_rule#Composite_Simpson's_rule As can be seen in the Wikipedia article, the weights are distributed according to the point at which we are evaluating the...
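    The weight pattern the article describes is h/3 times 1, 4, 2, 4, ..., 2, 4, 1 over an even number of subintervals. The poster's integrand is not shown, so the sketch below just integrates sin(x) over [0, pi] as a stand-in:

      program simpson_demo
         implicit none
         ! Composite Simpson's rule; exact value of the test integral is 2.
         print *, 'integral =', simpson(0.0d0, acos(-1.0d0), 1000)
      contains
         double precision function simpson(a, b, n) result(s)
            double precision, intent(in) :: a, b
            integer, intent(in) :: n          ! number of subintervals (even)
            double precision :: h
            integer :: i
            h = (b - a) / dble(n)
            s = f(a) + f(b)                   ! endpoints get weight 1
            do i = 1, n - 1
               if (mod(i, 2) == 1) then
                  s = s + 4.0d0 * f(a + i*h)  ! odd interior points: weight 4
               else
                  s = s + 2.0d0 * f(a + i*h)  ! even interior points: weight 2
               end if
            end do
            s = s * h / 3.0d0
         end function simpson

         double precision function f(x)
            double precision, intent(in) :: x
            f = sin(x)
         end function f
      end program simpson_demo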
  14. FloatingFeather

    Measuring scalability of MPI program

    Hi. I have been measuring the scalability of some codes I have written in Fortran and parallelized with MPI. The question is whether I have been doing it properly (the code scales; however, I would expect it to scale a bit better with the number of processors). I am using an AMD Ryzen 7 1700 8...
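    The usual way to measure strong scaling is to time one fixed problem with MPI_Wtime for 1, 2, 4, 8 processes and compute the speedup T(1)/T(p); on an 8-core Ryzen 7 1700, shared memory bandwidth (and SMT, if ranks land on sibling threads) will normally keep the curve somewhat below ideal, so less-than-linear speedup is not necessarily a measurement mistake. A minimal timing sketch, with a made-up workload:

      program scaling_demo
         use mpi
         implicit none
         integer, parameter :: n = 50000000       ! fixed total amount of work
         integer :: ierr, rank, nprocs, i
         double precision :: t0, t1, local, total

         call MPI_Init(ierr)
         call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
         call MPI_Comm_size(MPI_COMM_WORLD, nprocs, ierr)

         call MPI_Barrier(MPI_COMM_WORLD, ierr)   ! start all ranks together
         t0 = MPI_Wtime()

         local = 0.0d0                            ! each rank does n/nprocs terms
         do i = rank + 1, n, nprocs
            local = local + 1.0d0 / dble(i)**2
         end do
         call MPI_Reduce(local, total, 1, MPI_DOUBLE_PRECISION, MPI_SUM, 0, &
                         MPI_COMM_WORLD, ierr)

         t1 = MPI_Wtime()
         if (rank == 0) print '(a,i3,a,f8.3,a)', 'procs =', nprocs, &
                              '  elapsed =', t1 - t0, ' s'
         call MPI_Finalize(ierr)
      end program scaling_demo
      ! Run e.g. with: mpirun -np 1 ./scaling_demo, then -np 2, 4, 8,
      ! and compute speedup(p) = T(1) / T(p).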
  15. FloatingFeather

    Fortran90 code with FORTRAN77 subroutine

    Thanks a lot for your reply. I have a few more questions. The first one concerns the declaration real*8 g(mxpts) in my code. The question is whether, written like that, the memory for the array has to be set up every time I call the subroutine. My intuition is that, because it...
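    How g behaves depends on what mxpts is, which the snippet doesn't show. If mxpts is a PARAMETER (a constant), g is a fixed-size local array and nothing is allocated from the heap on each call; if the size comes in as a dummy argument, g is an automatic array created anew on every call (normally on the stack, which is cheap, although very large automatic arrays can overflow it). Roughly:

      ! Case 1: mxpts is a constant -> fixed-size local array, no per-call
      ! heap allocation (static or stack storage, depending on compiler flags).
      subroutine work_fixed(x, n)
         implicit none
         integer, parameter :: mxpts = 10000
         integer, intent(in) :: n
         real*8, intent(inout) :: x(n)
         real*8 g(mxpts)
         g(1:n) = 2.0d0 * x(1:n)
         x(1:n) = g(1:n)
      end subroutine work_fixed

      ! Case 2: size passed in -> automatic array, created on every call.
      subroutine work_auto(x, n)
         implicit none
         integer, intent(in) :: n
         real*8, intent(inout) :: x(n)
         real*8 g(n)
         g = 2.0d0 * x
         x = g
      end subroutine work_auto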
  16. FloatingFeather

    run parallel program

    It seems that what you are doing is running the same process on all 8 cores; that's why you get the same output 8 times. You should learn some parallel programming in order to parallelize your code, or try the automatic parallelization that the Intel Fortran compiler offers. You can't get...
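    For what it's worth, the difference between "8 copies of the same program" and an MPI program is that each MPI rank asks for its rank and works only on its own share of the problem. A minimal sketch of that idea (unrelated to the original poster's code):

      program split_work
         use mpi
         implicit none
         integer, parameter :: n = 100        ! total number of iterations
         integer :: ierr, rank, nprocs, istart, iend

         call MPI_Init(ierr)
         call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
         call MPI_Comm_size(MPI_COMM_WORLD, nprocs, ierr)

         ! Each rank takes its own contiguous block instead of every rank
         ! repeating the full loop (which is what prints the same output 8 times).
         istart = rank * n / nprocs + 1
         iend   = (rank + 1) * n / nprocs
         print '(a,i2,a,i4,a,i4)', 'rank', rank, ' handles i =', istart, ' ..', iend

         call MPI_Finalize(ierr)
      end program split_work
      ! Launched with: mpirun -np 8 ./split_work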
  17. FloatingFeather

    Fortran time optimization flags

    The fastest executable the compiler alone will give you is usually obtained with the -Ofast flag (note that it enables -ffast-math, which relaxes strict IEEE floating-point behaviour). That's about the best you can do without getting your hands into the code. Of course, the best option would be to parallelize the code, but with only two cores the most you can hope for is to reduce the run time by roughly a factor of 2.
  18. FloatingFeather

    Fortran90 code with FORTRAN77 subroutine

    Hi there. I have written a code as a series of subroutines. The code works properly; it does the job. However, I first wrote it in FORTRAN 77 (don't ask why, I wish I hadn't), and then ported it to Fortran 90 because I needed dynamic arrays to work with MPI. However, I...
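    The Fortran 90 feature being referred to is the ALLOCATABLE array, which removes the fixed upper bounds FORTRAN 77 imposes and lets each MPI rank size its buffers at run time. A minimal sketch (names made up):

      program dynamic_demo
         implicit none
         double precision, allocatable :: g(:)
         integer :: n

         n = 1000                 ! size known only at run time
         allocate(g(n))           ! FORTRAN 77 would need a fixed maximum size
         call fill(g, n)
         print *, 'g(1), g(n) =', g(1), g(n)
         deallocate(g)
      contains
         subroutine fill(a, n)
            integer, intent(in) :: n
            double precision, intent(out) :: a(n)
            integer :: i
            do i = 1, n
               a(i) = dble(i)
            end do
         end subroutine fill
      end program dynamic_demo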
