Bug #20943

closed

Provisioning Setup, Kafo Error, undefined method `keys' for "134.155.91.233 84.226":String (NoMethodError)

Added by mr ch over 6 years ago. Updated over 2 years ago.

Status: Closed
Priority: Normal
Assignee: -
Difficulty:
Triaged: No
Fixed in Releases:
Found in Releases:

Description

Hey, thanks a lot to gwmngilfen and singularity42. The problem I had was quite strange. Apart from this issue, I could never install Foreman in "i" mode straight away with the many components I needed, but had to do a default install first.

Main issue:
Even after the regular, non-"i" install, I went through the great Provisioning Setup wizard, which generated:

foreman-installer \
--enable-foreman-proxy \
--foreman-proxy-tftp=true \
--foreman-proxy-tftp-servername=10.10.10.1 \
--foreman-proxy-dhcp=true \
--foreman-proxy-dhcp-interface=ens5f0 \
--foreman-proxy-dhcp-gateway= \
--foreman-proxy-dhcp-nameservers="10.10.10.1" \
--foreman-proxy-dns=true \
--foreman-proxy-dns-interface=ens5f0 \
--foreman-proxy-dns-zone=mannheim.bw-cloud.org \
--foreman-proxy-dns-reverse=10.10.10.in-addr.arpa \
--foreman-proxy-dns-forwarders=XXXXXXXX \
--foreman-proxy-foreman-base-url=https://XXXXXXXXXXXXXXXXXXXXXX.org \
--foreman-proxy-oauth-consumer-key=XXXXXXXXXXXXXXXXXXXXXXX \
--foreman-proxy-oauth-consumer-secret=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX

Running it, I ran into the following error twice, on my dev VM and on bare metal:

/usr/share/gems/gems/kafo-2.0.0/lib/kafo/configuration.rb:97:in `modules': undefined method `keys' for "134.155.91.233 84.226":String (NoMethodError)
from /usr/share/gems/gems/kafo-2.0.0/lib/kafo/configuration.rb:207:in `params'
from /usr/share/gems/gems/kafo-2.0.0/lib/kafo/configuration.rb:217:in `preset_defaults_from_puppet'
from /usr/share/gems/gems/kafo-2.0.0/lib/kafo/kafo_configure.rb:285:in `set_parameters'
from /usr/share/gems/gems/kafo-2.0.0/lib/kafo/kafo_configure.rb:100:in `initialize'
from /usr/share/gems/gems/clamp-1.0.0/lib/clamp/command.rb:133:in `new'
from /usr/share/gems/gems/clamp-1.0.0/lib/clamp/command.rb:133:in `run'
from /usr/share/gems/gems/kafo-2.0.0/lib/kafo/kafo_configure.rb:163:in `run'
from /usr/sbin/foreman-installer:8:in `<main>'

(The puppet.conf parser is set to "current", and I also don't have any - !ruby/string:HighLine::String |- syntax in foreman-answers.yaml.)
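
For reference, the traceback suggests Kafo called .keys on a value that turned out to be the String "134.155.91.233 84.226" where it expected a Hash of module parameters. A minimal Ruby sketch of that class of failure, using invented module names and values rather than anything from my real foreman-answers.yaml:

require 'yaml'

# Hypothetical answers data: the second top-level value, which should be a
# Hash of parameters, has collapsed into a plain String of what look like
# nameserver IPs.
answers = YAML.load(<<~YAML)
  foreman:
    some_param: some_value
  foreman_proxy: "134.155.91.233 84.226"
YAML

answers.each do |mod, params|
  begin
    # Code that assumes every value responds to #keys fails on the String:
    puts "#{mod}: #{params.keys.inspect}"
  rescue NoMethodError => e
    puts "#{mod}: #{e.message}"
  end
end
# foreman: ["some_param"]
# foreman_proxy: undefined method `keys' for "134.155.91.233 84.226":String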

Huge thanks in advance.

Actions #1

Updated by mr ch over 6 years ago

Icon name: computer-server
Chassis: server
Operating System: CentOS Linux 7 (Core)
CPE OS Name: cpe:/o:centos:centos:7
Kernel: Linux 3.10.0-514.el7.x86_64
Architecture: x86-64

rubygem-kafo_wizards-0.0.1-2.el7.noarch
rubygem-kafo_parsers-0.1.6-1.el7.noarch
rubygem-kafo-2.0.0-1.el7.noarch
puppetlabs-release-pc1-1.1.0-5.el7.noarch
puppet-agent-1.10.6-1.el7.x86_64
puppetserver-2.7.2-1.el7.noarch
puppet-agent-oauth-0.5.1-3.el7.noarch
tfm-rubygem-hammer_cli_foreman-0.10.2-1.el7.noarch
tfm-rubygem-hammer_cli_foreman_openscap-0.1.4-1.fm1_15.el7.noarch
foreman-compute-1.15.3-1.el7.noarch
tfm-rubygem-foreman-tasks-core-0.1.4-1.fm1_15.el7.noarch
tfm-rubygem-foreman_ansible_core-1.1.1-1.fm1_15.el7.noarch
tfm-rubygem-foreman_ansible-1.4.5-1.fm1_15.el7.noarch
tfm-rubygem-foreman_default_hostgroup-4.0.0-1.fm1_13.el7.noarch
tfm-rubygem-foreman_expire_hosts-3.0.0-1.fm1_15.el7.noarch
tfm-rubygem-foreman_openscap-0.7.4-1.fm1_15.el7.noarch
foreman-1.15.3-1.el7.noarch
foreman-release-1.15.3-1.el7.noarch
foreman-release-scl-3-1.el7.noarch
foreman-installer-1.15.3-1.el7.noarch
foreman-proxy-1.15.3-1.el7.noarch
foreman-cli-1.15.3-1.el7.noarch
tfm-rubygem-foreman_setup-5.0.0-1.fm1_13.el7.noarch
foreman-libvirt-1.15.3-1.el7.noarch
foreman-ovirt-1.15.3-1.el7.noarch
tfm-rubygem-foreman-tasks-0.9.4-1.fm1_15.el7.noarch
tfm-rubygem-foreman_cockpit-2.0.3-1.fm1_15.el7.noarch
tfm-rubygem-foreman_dhcp_browser-0.0.7-3.fm1_11.el7.noarch
tfm-rubygem-foreman_discovery-9.1.1-1.fm1_15.el7.noarch
tfm-rubygem-foreman_hooks-0.3.14-1.fm1_15.el7.noarch
tfm-rubygem-foreman_monitoring-0.1.0-1.fm1_15.el7.noarch
tfm-rubygem-foreman_remote_execution-1.3.3-1.fm1_15.el7.noarch
tfm-rubygem-foreman_templates-5.0.1-1.fm1_15.el7.noarch
foreman-debug-1.15.3-1.el7.noarch
foreman-postgresql-1.15.3-1.el7.noarch
foreman-openstack-1.15.3-1.el7.noarch
tfm-rubygem-foreman_bootdisk-9.0.0-1.fm1_15.el7.noarch
tfm-rubygem-foreman_digitalocean-1.2.0-1.fm1_15.el7.noarch
tfm-rubygem-foreman_docker-3.2.1-1.fm1_15.el7.noarch
tfm-rubygem-foreman_host_extra_validator-0.0.4-1.fm1_13.el7.noarch
foreman-selinux-1.15.3-1.el7.noarch
tfm-rubygem-foreman_remote_execution_core-1.0.5-1.fm1_15.el7.noarch

Actions #2

Updated by Ivan Necas over 6 years ago

  • Project changed from Foreman to Kafo
Actions #3

Updated by Ewoud Kohl van Wijngaarden over 6 years ago

While I can't reproduce it, that looks like some IP-detection gone wrong. Can you show me the IP configuration? Or are you passing those in as forwarders?

Actions #4

Updated by mr ch over 6 years ago

Ewoud Kohl van Wijngaarden wrote:

While I can't reproduce it, that looks like some IP-detection gone wrong. Can you show me the IP configuration? Or are you passing those in as forwarders?

[root@XXXXXXXXXXXXXX ~]# ip a
1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN qlen 1
link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
inet 127.0.0.1/8 scope host lo
valid_lft forever preferred_lft forever
inet6 ::1/128 scope host
valid_lft forever preferred_lft forever
2: ens5f0: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc mq state UP qlen 1000
link/ether XXXXXXXXXXXXXXXXXXXX brd ff:ff:ff:ff:ff:ff
inet 10.10.10.1/24 brd 10.10.10.255 scope global ens5f0
valid_lft forever preferred_lft forever
inet6 XXXX::XXXXXXXXXXXXXXXXX:5428/64 scope link
valid_lft forever preferred_lft forever
3: ens5f1: <NO-CARRIER,BROADCAST,MULTICAST,UP> mtu 1500 qdisc mq state DOWN qlen 1000
link/ether XXXXXXXXXXXXXXXXXXX brd ff:ff:ff:ff:ff:ff
4: ens4f0: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc mq state UP qlen 1000
link/ether XXXXXXXXXXXXXXX brd ff:ff:ff:ff:ff:ff
inet XXXXXXXXXXXXXXXX/23 brd XXXXXXXXXXXXXXX.255 scope global ens4f0
valid_lft forever preferred_lft forever
inet6 XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX/64 scope global mngtmpaddr dynamic
valid_lft 2591901sec preferred_lft 604701sec
inet6 XXXXXXXXXXXXXXXXXXXXXXXXX/64 scope link
valid_lft forever preferred_lft forever
5: ens4f1: <NO-CARRIER,BROADCAST,MULTICAST,UP> mtu 1500 qdisc mq state DOWN qlen 1000
link/ether XXXXXXXXXXXXXXXXX brd ff:ff:ff:ff:ff:ff
You have new mail in /var/spool/mail/root

Actions #5

Updated by Ewoud Kohl van Wijngaarden over 6 years ago

The 134.155.91.233 84.226, is that anywhere in that info or are they your recursors? They look like an IP and a partial IP.

Actions #6

Updated by mr ch over 6 years ago

Ewoud Kohl van Wijngaarden wrote:

The 134.155.91.233 84.226, is that anywhere in that info or are they your recursors? They look like an IP and a partial IP.

No, I don't know where they are coming from; the host has all the information it needs. On a second run of foreman-installer with the parameters below, I ran into this DHCP error, and a new puppet.service was left running with a connection error:

[root@foreman ~]# foreman-installer \
--enable-foreman-proxy \
--foreman-proxy-tftp=true \
--foreman-proxy-tftp-servername=10.10.10.1 \
--foreman-proxy-dhcp=true \
--foreman-proxy-dhcp-interface=ens5f0 \
--foreman-proxy-dhcp-gateway= \
--foreman-proxy-dhcp-nameservers="10.10.10.1" \
--foreman-proxy-dns=true \
--foreman-proxy-dns-interface=ens5f0 \
--foreman-proxy-dns-zone=XXXXXXXX \
--foreman-proxy-dns-reverse=10.10.10.in-addr.arpa \
--foreman-proxy-dns-forwarders=XXXXXXXX \
--foreman-proxy-dns-forwarders=XXXXXXXX \
--foreman-proxy-foreman-base-url=https://XXXXXXXXXXXXXXXX \
--foreman-proxy-oauth-consumer-key=XXXXXXXX \
--foreman-proxy-oauth-consumer-secret=XXXXXXXX

Systemd start for dhcpd failed!
journalctl log for dhcpd:
-- Logs begin at Fri 2017-09-15 14:10:36 UTC, end at Fri 2017-09-15 14:46:28 UTC. --
Sep 15 14:46:28 XXXXXXXXXXXXXXXX systemd[1]: Starting DHCPv4 Server Daemon...
Sep 15 14:46:28 XXXXXXXXXXXXXXXX dhcpd[7112]: Internet Systems Consortium DHCP Server 4.2.5
Sep 15 14:46:28 XXXXXXXXXXXXXXXX dhcpd[7112]: Copyright 2004-2013 Internet Systems Consortium.
Sep 15 14:46:28 XXXXXXXXXXXXXXXX dhcpd[7112]: All rights reserved.
Sep 15 14:46:28 XXXXXXXXXXXXXXXX dhcpd[7112]: For info, please visit https://www.isc.org/software/dhcp/
Sep 15 14:46:28 XXXXXXXXXXXXXXXX systemd[1]: dhcpd.service: main process exited, code=exited, status=1/FAILURE
Sep 15 14:46:28 XXXXXXXXXXXXXXXX systemd[1]: Failed to start DHCPv4 Server Daemon.
Sep 15 14:46:28 XXXXXXXXXXXXXXXX systemd[1]: Unit dhcpd.service entered failed state.
Sep 15 14:46:28 XXXXXXXXXXXXXXXX systemd[1]: dhcpd.service failed.

/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/provider/service/systemd.rb:166:in `rescue in start'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/provider/service/systemd.rb:163:in `start'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/type/service.rb:103:in `block (3 levels) in <module:Puppet>'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/property.rb:487:in `set'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/property.rb:561:in `sync'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/type/service.rb:114:in `sync'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction/resource_harness.rb:236:in `sync'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction/resource_harness.rb:134:in `sync_if_needed'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction/resource_harness.rb:80:in `perform_changes'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction/resource_harness.rb:21:in `evaluate'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction.rb:230:in `apply'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction.rb:246:in `eval_resource'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction.rb:163:in `call'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction.rb:163:in `block (2 levels) in evaluate'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/util.rb:507:in `block in thinmark'
/opt/puppetlabs/puppet/lib/ruby/2.1.0/benchmark.rb:294:in `realtime'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/util.rb:506:in `thinmark'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction.rb:163:in `block in evaluate'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/graph/relationship_graph.rb:118:in `traverse'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction.rb:154:in `evaluate'
/usr/share/gems/gems/kafo-2.0.0/modules/kafo_configure/lib/puppet/parser/functions/add_progress.rb:30:in `evaluate_with_trigger'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/resource/catalog.rb:222:in `block in apply'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/util/log.rb:155:in `with_destination'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction/report.rb:142:in `as_logging_destination'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/resource/catalog.rb:221:in `apply'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/configurer.rb:171:in `block in apply_catalog'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/util.rb:224:in `block in benchmark'
/opt/puppetlabs/puppet/lib/ruby/2.1.0/benchmark.rb:294:in `realtime'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/util.rb:223:in `benchmark'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/configurer.rb:170:in `apply_catalog'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/configurer.rb:343:in `run_internal'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/configurer.rb:221:in `block in run'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/context.rb:65:in `override'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet.rb:306:in `override'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/configurer.rb:195:in `run'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/application/apply.rb:350:in `apply_catalog'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/application/apply.rb:274:in `block in main'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/context.rb:65:in `override'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet.rb:306:in `override'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/application/apply.rb:225:in `main'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/application/apply.rb:170:in `run_command'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/application.rb:358:in `block in run'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/util.rb:662:in `exit_on_fail'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/application.rb:358:in `run'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/util/command_line.rb:132:in `run'
/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/util/command_line.rb:72:in `execute'
/opt/puppetlabs/puppet/bin/puppet:5:in `<main>'
/Stage[main]/Dhcp/Service[dhcpd]/ensure: change from stopped to running failed: Systemd start for dhcpd failed!
journalctl log for dhcpd:
-- Logs begin at Fri 2017-09-15 14:10:36 UTC, end at Fri 2017-09-15 14:46:28 UTC. --
Sep 15 14:46:28 XXXXXXXXXXXXXXXX systemd[1]: Starting DHCPv4 Server Daemon...
Sep 15 14:46:28 XXXXXXXXXXXXXXXX dhcpd[7112]: Internet Systems Consortium DHCP Server 4.2.5
Sep 15 14:46:28 XXXXXXXXXXXXXXXX dhcpd[7112]: Copyright 2004-2013 Internet Systems Consortium.
Sep 15 14:46:28 XXXXXXXXXXXXXXXX dhcpd[7112]: All rights reserved.
Sep 15 14:46:28 XXXXXXXXXXXXXXXX dhcpd[7112]: For info, please visit https://www.isc.org/software/dhcp/
Sep 15 14:46:28 XXXXXXXXXXXXXXXX systemd[1]: dhcpd.service: main process exited, code=exited, status=1/FAILURE
Sep 15 14:46:28 XXXXXXXXXXXXXXXX systemd[1]: Failed to start DHCPv4 Server Daemon.
Sep 15 14:46:28 XXXXXXXXXXXXXXXX systemd[1]: Unit dhcpd.service entered failed state.
Sep 15 14:46:28 XXXXXXXXXXXXXXXX systemd[1]: dhcpd.service failed.
Installing Done [100%] [..........................................................................]
Something went wrong! Check the log for ERROR-level output
* Foreman is running at https://XXXXXXXXXXXXXXXXXXXXX
    Initial credentials are admin / XXXXXXXXXXXXXXXXXX
* Foreman Proxy is running at https://XXXXXXXXXXXXXXXXXXX:8443
* Puppetmaster is running at port 8140
The full log is at /var/log/foreman-installer/foreman.log

● puppet.service - Puppet agent
Loaded: loaded (/usr/lib/systemd/system/puppet.service; enabled; vendor preset: disabled)
Active: active (running) since Fri 2017-09-15 14:10:51 UTC; 41min ago
Main PID: 847 (puppet)
CGroup: /system.slice/puppet.service
└─847 /opt/puppetlabs/puppet/bin/ruby /opt/puppetlabs/puppet/bin/puppet agent --no-daemonize

Sep 15 14:11:14 XXXXXXXXXXXXXXXX puppet-agent[1208]: Could not send report: Connection refused - connect(2) for "XXXXXXXXXXXXXXXX" port 8140
Sep 15 14:41:08 XXXXXXXXXXXXXXXX puppet-agent[5890]: Unable to fetch my node definition, but the agent run will continue:
Sep 15 14:41:08 XXXXXXXXXXXXXXXX puppet-agent[5890]: Connection refused - connect(2) for "XXXXXXXXXXXXXXXX" port 8140
Sep 15 14:41:08 XXXXXXXXXXXXXXXX puppet-agent[5890]: (/File[/opt/puppetlabs/puppet/cache/facts.d]) Failed to generate additional resources using 'eval_generate': Connection refused - connect(2) for "XXXXXXXXXXXXXXXX" port 8140
Sep 15 14:41:08 XXXXXXXXXXXXXXXX puppet-agent[5890]: (/File[/opt/puppetlabs/puppet/cache/facts.d]) Could not evaluate: Could not retrieve file metadata for puppet:///pluginfacts: Connection refused - connect(2) for "XXXXXXXXXXXXXXXX" port 8140
Sep 15 14:41:08 XXXXXXXXXXXXXXXX puppet-agent[5890]: (/File[/opt/puppetlabs/puppet/cache/lib]) Failed to generate additional resources using 'eval_generate': Connection refused - connect(2) for "XXXXXXXXXXXXXXXX" port 8140
Sep 15 14:41:08 XXXXXXXXXXXXXXXX puppet-agent[5890]: (/File[/opt/puppetlabs/puppet/cache/lib]) Could not evaluate: Could not retrieve file metadata for puppet:///plugins: Connection refused - connect(2) for "XXXXXXXXXXXXXXXX" port 8140
Sep 15 14:41:11 XXXXXXXXXXXXXXXX puppet-agent[5890]: Could not retrieve catalog from remote server: Connection refused - connect(2) for "XXXXXXXXXXXXXXXX " port 8140
Sep 15 14:41:11 XXXXXXXXXXXXXXXX puppet-agent[5890]: Applied catalog in 0.27 seconds
Sep 15 14:41:11 XXXXXXXXXXXXXXXX puppet-agent[5890]: Could not send report: Connection refused - connect(2) for "XXXXXXXXXXXXXXXX " port 8140

Firewall is open:
[root@XXXXXXXXXXXXXXXX var]# curl XXXXXXXXXXXXXXXX:8140
curl: (52) Empty reply from server

Actions #7

Updated by mr ch over 6 years ago

Ewoud Kohl van Wijngaarden wrote:

The 134.155.91.233 84.226, is that anywhere in that info or are they your recursors? They look like an IP and a partial IP.

Solved after some reboots, on the 3rd run. Reason: on the 2nd run the (supposedly optional :/) --foreman-proxy-dhcp-nameservers= was not set, and that caused the DHCP issue. Note that there is still no DNS running, and I don't know whether DHCP needs it to function correctly during further steps (PXE and so on).
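
In case it helps anyone hitting the same traceback: a quick way to look for such a malformed entry is to check that every top-level value in the answers file is either a Hash of parameters or a Boolean enable/disable switch, which, as far as I understand, is the layout Kafo expects. This is only a hypothetical helper, not part of the installer; pass it the path to your own answers file (foreman-answers.yaml in my case):

#!/usr/bin/env ruby
# Sketch: flag top-level answers-file entries that are neither a Hash of
# parameters nor a Boolean, since those are the shapes Kafo normally handles.
require 'yaml'

abort "usage: #{$0} /path/to/foreman-answers.yaml" if ARGV.empty?

YAML.load_file(ARGV[0]).each do |mod, params|
  next if params.is_a?(Hash) || params == true || params == false
  puts "suspicious entry: #{mod} => #{params.inspect} (#{params.class})"
end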

Actions #8

Updated by Ewoud Kohl van Wijngaarden over 2 years ago

  • Status changed from New to Closed