Hi there,
On one of our Sun Netra t1 servers I get severe error messages and the system fails to boot completely.
Before this happened, the Solaris 8 system came up normally.
I only added an lmgrd start script to runlevel 2.
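Roughly what I did to add it (the script and link names here are from memory, so treat them as assumptions rather than an exact copy of what is on the disk):

```shell
# Install a start/stop wrapper for lmgrd and link it into runlevel 2.
# The path /opt/flexlm/lmgrd.sh and the link name S98lmgrd are assumptions.
cp /opt/flexlm/lmgrd.sh /etc/init.d/lmgrd
chmod 744 /etc/init.d/lmgrd

# An "S" script in /etc/rc2.d is run on entry to runlevel 2.
ln -s /etc/init.d/lmgrd /etc/rc2.d/S98lmgrd
```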
The server has two ordinary SCSI disks.
Now the following messages appear and the Netra fails to boot, again and again:
WARNING: add_spec: No major number for mpt
(several times)
Cannot assemble drivers for root /pci@1f,0/pc@1,1/scsi@2/disk@0,0:a
Cannot mount root on /pci@1f,0/pc@1,1/scsi@2/disk@0,0:a fstype ufs
panic[cpu0]/thread=10408000: vfs_mountroot: cannot mount root
(followed by several lines including genunix:vfs_mountroot+70 and genunix:main+8c)
skipping system dump - no dump device configured
rebooting...
Resetting ...
I tried swapping the boot disk (which worked) and the system came up. I ran fsck on the corrupt disk, but everything seems to be all right. I assume there is a (simple) configuration error or a missing driver somewhere. The system has no RAID or logical volumes set up.
Is there a way to revive the corrupt disk? It's a new system disk, already set up and configured.
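In case it helps, this is what I was planning to check next from the working boot disk. My guess is that the "add_spec: No major number for mpt" warning means the bad disk's driver tables reference the mpt driver without a matching major-number entry. The device name c0t1d0s0 below is hypothetical; substitute the real target of the failing disk:

```shell
# Mount the root slice of the failing disk under /mnt (device name assumed).
mount /dev/dsk/c0t1d0s0 /mnt

# Compare the driver major-number table on the running system with the
# copy on the bad disk; the mpt entry should appear in both or in neither.
grep mpt /etc/name_to_major /mnt/etc/name_to_major

# The device instance map must stay consistent with name_to_major, so
# compare that as well.
diff /etc/path_to_inst /mnt/etc/path_to_inst

umount /mnt
```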
Many thanks for your help,
lordsun