mt100eocg unit spontaneously resets to default network configuration
Tagged: /var/config, boot, configuration, mt100eocg
This topic has 2 replies, 2 voices, and was last updated 8 years, 11 months ago by Darrik Spaude.
November 9, 2015 at 12:13 pm · #9861 · Klaus Holst Jacobsen (Participant)

Hello,

We have an mt100eocg running with our own configuration files (PPP and network configuration files). These are all located in the proper places under /var/config.

Everything runs properly for long periods and across many reboots, but occasionally our config files disappear and the unit reverts, for example, to its default IP address. Is this a known problem, or might we be doing something wrong?

Is there some way we can investigate this to find out what really happens? I have read that one way to make a unit revert to defaults is to run "rm -fr /var/config/*", but we are not doing that!
Regards,
Klaus

December 14, 2015 at 4:28 am · #10424 · Klaus Holst Jacobsen (Participant)

Hello Multitech,
This post is a month old; how long a response time can I expect, if any?
I have found out some more. It seems that mounting /var/config fails, which causes the /etc/init.d/config script to re-create the mtd6 partition and unzip the defaults onto it. See the log below. The real question, though, is why the mount fails in the first place.
I have seen this happen on multiple units, so it is not an isolated incident.

Mounting /var/config
mount: page allocation failure. order:4, mode:0xd0
[<c002e190>] (unwind_backtrace+0x0/0xe0) from [<c0076c4c>] (__alloc_pages_nodemask+0x4dc/0x534)
[<c0076c4c>] (__alloc_pages_nodemask+0x4dc/0x534) from [<c0092428>] (cache_alloc_refill+0x2c8/0x5cc)
[<c0092428>] (cache_alloc_refill+0x2c8/0x5cc) from [<c00927bc>] (__kmalloc+0x90/0xec)
[<c00927bc>] (__kmalloc+0x90/0xec) from [<c016b7a8>] (jffs2_sum_init+0x64/0xc4)
[<c016b7a8>] (jffs2_sum_init+0x64/0xc4) from [<c0162da8>] (jffs2_do_mount_fs+0x168/0x54c)
[<c0162da8>] (jffs2_do_mount_fs+0x168/0x54c) from [<c0165168>] (jffs2_do_fill_super+0x130/0x224)
[<c0165168>] (jffs2_do_fill_super+0x130/0x224) from [<c01d4c10>] (get_sb_mtd_aux.clone.0+0x54/0xb8)
[<c01d4c10>] (get_sb_mtd_aux.clone.0+0x54/0xb8) from [<c01d4d84>] (get_sb_mtd+0x110/0x140)
[<c01d4d84>] (get_sb_mtd+0x110/0x140) from [<c01653ec>] (jffs2_get_sb+0x18/0x20)
[<c01653ec>] (jffs2_get_sb+0x18/0x20) from [<c0096578>] (vfs_kern_mount+0x50/0x114)
[<c0096578>] (vfs_kern_mount+0x50/0x114) from [<c0096680>] (do_kern_mount+0x34/0xdc)
[<c0096680>] (do_kern_mount+0x34/0xdc) from [<c00adad4>] (do_mount+0x6ec/0x780)
[<c00adad4>] (do_mount+0x6ec/0x780) from [<c00addd4>] (sys_mount+0x84/0xc4)
[<c00addd4>] (sys_mount+0x84/0xc4) from [<c0029e80>] (ret_fast_syscall+0x0/0x2c)
Mem-info:
Normal per-cpu:
CPU 0: hi: 18, btch: 3 usd: 0
active_anon:539 inactive_anon:591 isolated_anon:0
active_file:6826 inactive_file:4748 isolated_file:0
unevictable:0 dirty:2 writeback:0 unstable:0
free:414 slab_reclaimable:330 slab_unreclaimable:1093
mapped:233 shmem:39 pagetables:120 bounce:0
Normal free:1656kB min:1016kB low:1268kB high:1524kB active_anon:2156kB inactive_anon:2364kB active_file:27304kB inactive_file:18992kB unevictable:0kB isolated(anon):0kB isolated(file):0kB present:65024kB mlocked:0kB dirty:8kB writeback:0kB mapped:932kB shmem:156kB slab_reclaimable:1320kB slab_unreclaimable:4372kB kernel_stack:584kB pagetables:480kB unstable:0kB bounce:0kB writeback_tmp:0kB pages_scanned:0 all_unreclaimable? no
lowmem_reserve[]: 0 0
Normal: 10*4kB 60*8kB 49*16kB 7*32kB 2*64kB 0*128kB 0*256kB 0*512kB 0*1024kB 0*2048kB 0*4096kB = 1656kB
11613 total pagecache pages
16384 pages of RAM
607 free pages
1094 reserved pages
1423 slab pages
16158 pages shared
0 pages swap cached
JFFS2 warning: (257) jffs2_sum_init: Can't allocate buffer for writing out summary information!
mount: Cannot allocate memory
Creating /var/config
flash_eraseall has been replaced by flash_erase <mtddev> 0 0; please use it
Erasing 128 Kibyte @ 0 -- 0 % complete
flash_erase: Cleanmarker written at 0
Erasing 128 Kibyte @ 20000 -- 1 % complete
flash_erase: Cleanmarker written at 20000
[... identical erase/cleanmarker messages repeat for each 128 KiB block from 40000 through 6c0000 ...]
Erasing 128 Kibyte @ 6e0000 -- 85 % complete
JFFS2 notice: (259) jffs2_build_xattr_subsystem: complete building xattr subsystem, 0 of xdatum (0 unchecked, 0 orphan) and 0 of xref (0 dead, 0 orphan) found.
flash_erase: Cleanmarker written at 6e0000
Erasing 128 Kibyte @ 700000 -- 87 % complete
flash_erase: Cleanmarker written at 700000
Erasing 128 Kibyte @ 720000 -- 89 % complete
flash_erase: Cleanmarker written at 720000
Erasing 128 Kibyte @ 740000 -- 90 % complete
flash_erase: Cleanmarker written at 740000
Erasing 128 Kibyte @ 760000 -- 92 % complete
flash_erase: Cleanmarker written at 760000
Erasing 128 Kibyte @ 780000 -- 93 % complete
flash_erase: Cleanmarker written at 780000
Erasing 128 Kibyte @ 7a0000 -- 95 % complete
flash_erase: Cleanmarker written at 7a0000
Erasing 128 Kibyte @ 7c0000 -- 96 % complete
flash_erase: Cleanmarker written at 7c0000
Erasing 128 Kibyte @ 7e0000 -- 98 % complete
flash_erase: Cleanmarker written at 7e0000
Erasing 128 Kibyte @ 7e0000 -- 100 % complete
network/
network/interfaces
ppp/
ppp/chap-secrets
ppp/options
ppp/pap-secrets
ppp/peers/
ppp/peers/gsm
ppp/peers/cdma_chat
ppp/peers/gsm_chat
ppp/peers/cdma
passwd
group
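The "page allocation failure. order:4" in the trace above means the kernel could not find 2^4 = 16 contiguous pages (64 KiB) for the buffer that jffs2_sum_init allocates, so the mount fails with ENOMEM even though free memory exists; the memory is simply too fragmented. Because the init script treats any mount failure as corruption, a transient allocation failure turns into a factory reset. A defensive workaround is to retry the mount before erasing. The sketch below is hypothetical hardening, not the shipped /etc/init.d/config: the function name, retry count, and drop_caches step are my own additions.

```shell
#!/bin/sh
# Hypothetical hardening sketch (not the stock /etc/init.d/config):
# retry a mount command a few times before treating failure as corruption.
# A transient ENOMEM caused by memory fragmentation often clears up after
# the kernel drops its page caches.

MAX_TRIES=${MAX_TRIES:-3}      # attempts before giving up
RETRY_DELAY=${RETRY_DELAY:-1}  # seconds between attempts

# try_mount CMD [ARGS...]: run CMD up to MAX_TRIES times; return 0 on the
# first success, 1 if every attempt fails.
try_mount() {
    i=1
    while [ "$i" -le "$MAX_TRIES" ]; do
        "$@" && return 0
        # Encourage the kernel to free cached pages so a large contiguous
        # allocation is more likely to succeed next time (needs root).
        if [ -w /proc/sys/vm/drop_caches ]; then
            echo 3 > /proc/sys/vm/drop_caches
        fi
        sleep "$RETRY_DELAY"
        i=$((i + 1))
    done
    return 1
}

# In an init script this would wrap the real mount, for example:
#   try_mount mount -t jffs2 /dev/mtdblock6 /var/config || recreate_defaults
```

Only erase and re-create the partition when every retry has failed; that converts the transient allocation failure from a silent factory reset into, at worst, a short boot delay.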
Regards,
Klaus

December 28, 2015 at 3:41 pm · #10647 · Darrik Spaude (Keymaster)

Klaus created a support case, but unfortunately we have not seen this problem and are not sure what is causing it. Klaus asked whether the recommended solution is to add a bitbake append recipe to the "config" recipe in order to ensure the defaults correspond to whatever settings he would like. Our response was: yes, try that.
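For the bbappend approach, a minimal sketch might look like the following. Everything here is an assumption to adapt: the appended recipe is presumed to be named "config", the file names match files listed in the log above, and the install path presumes the recipe populates /var/config directly (the actual recipe may instead package a zip of defaults that the init script unpacks). Check the real config recipe in the BSP for the correct destination and, on newer Yocto releases, use the `:prepend`/`:append` override syntax.

```bitbake
# config.bbappend -- illustrative only; adapt names and paths to the real
# "config" recipe in your BSP.
FILESEXTRAPATHS_prepend := "${THISDIR}/files:"

# Ship your own defaults next to the .bbappend in a files/ directory.
SRC_URI += "file://interfaces \
            file://gsm"

do_install_append() {
    # Overwrite the factory defaults so a re-created /var/config comes up
    # with your settings instead of the stock ones.
    install -m 0644 ${WORKDIR}/interfaces ${D}/var/config/network/interfaces
    install -m 0644 ${WORKDIR}/gsm ${D}/var/config/ppp/peers/gsm
}
```

This does not fix the mount failure itself, but it means that when the init script does re-create the partition, the "defaults" it unpacks are already the desired configuration.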