Bug #14696
Unable to create new host
Description
Hi guys
Trying to set up a Katello 3.0 server, I'm unable to create a new host; it just keeps coming back to the "New host" screen.
And I'm getting the following error:
2016-04-18 16:58:44 [app] [I] Started GET "/hosts/new" for 192.168.60.60 at 2016-04-18 16:58:44 +0200
2016-04-18 16:58:44 [app] [I] Processing by HostsController#new as HTML
2016-04-18 16:58:44 [app] [I] Deface: [WARNING] No :original defined for 'hosts_update_environments_select', you should change its definition to include:
| :original => 'ec4418ddc20ff0c3667c8361538717db7690a13a'
2016-04-18 16:58:44 [app] [I] Rendered hosts/_progress.html.erb (0.8ms)
2016-04-18 16:58:44 [app] [I] Rendered nic/_base_form.html.erb (14.7ms)
2016-04-18 16:58:44 [app] [I] Rendered nic/_virtual_form.html.erb (1.3ms)
2016-04-18 16:58:44 [app] [I] Rendered nic/_provider_specific_form.html.erb (0.6ms)
2016-04-18 16:58:44 [app] [I] Rendered nic/manageds/_managed.html.erb (19.8ms)
2016-04-18 16:58:44 [app] [I] Rendered nic/_base_form.html.erb (10.9ms)
2016-04-18 16:58:44 [app] [I] Rendered nic/_virtual_form.html.erb (0.9ms)
2016-04-18 16:58:44 [app] [I] Rendered nic/_provider_specific_form.html.erb (0.2ms)
2016-04-18 16:58:44 [app] [I] Rendered nic/manageds/_managed.html.erb (14.2ms)
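For reference, the Deface warning above is resolved by adding the :original digest to the override definition, roughly like the sketch below. Only the override name and the digest come from the log; the virtual_path, selector and replacement partial are hypothetical placeholders:
Deface::Override.new(
  :virtual_path => 'hosts/_form',                    # hypothetical path, not from the log
  :name         => 'hosts_update_environments_select',
  :replace      => "select#host_environment_id",     # hypothetical selector
  :partial      => 'overrides/environments_select',  # hypothetical replacement partial
  :original     => 'ec4418ddc20ff0c3667c8361538717db7690a13a' # digest from the warning above
)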
Related issues
Associated revisions
Revision 9e5f80de - fixes #14696 - adjustments for fog-xenserver
History
#1
Updated by Dominic Cleal over 6 years ago
- Status changed from New to Need more information
There's no error listed; please include the full log from the POST when creating the host.
#2
Updated by Martin Juhl over 6 years ago
Hi
First, when I press the submit button, the POST body is:
utf8=%E2%9C%93&authenticity_token=awH3BFWPc7qpsH49NDWDpHQUdQN4fN1nqxxkpr%2BtDeM%3D&host%5Bname%5D=Naboo
&host%5Borganization_id%5D=1&host%5Blocation_id%5D=2&host%5Bhostgroup_id%5D=1&host%5Bcompute_resource_id
%5D=1&hostgroup%5Blifecycle_environment_id%5D=4&hostgroup%5Bcontent_view_id%5D=2&host%5Bcontent_facet_attributes
%5D%5Bid%5D=&host%5Benvironment_id%5D=1&host%5Bcontent_source_id%5D=1&hostgroup%5Bpuppetclass_ids%5D
%5B%5D=&host%5Bmanaged%5D=true&host%5Bprogress_report_id%5D=d4f9a5f2-9ccd-4a38-830e-29c930556af6&host
%5Btype%5D=Host%3A%3AManaged&host%5Binterfaces_attributes%5D%5B0%5D%5B_destroy%5D=0&host%5Binterfaces_attributes
%5D%5B0%5D%5Btype%5D=Nic%3A%3AManaged&host%5Binterfaces_attributes%5D%5B0%5D%5Bmac%5D=&host%5Binterfaces_attributes
%5D%5B0%5D%5Bidentifier%5D=&host%5Binterfaces_attributes%5D%5B0%5D%5Bname%5D=Naboo&host%5Binterfaces_attributes
%5D%5B0%5D%5Bdomain_id%5D=1&host%5Binterfaces_attributes%5D%5B0%5D%5Bsubnet_id%5D=1&host%5Binterfaces_attributes
%5D%5B0%5D%5Bip%5D=192.168.60.225&host%5Binterfaces_attributes%5D%5B0%5D%5Bmanaged%5D=0&host%5Binterfaces_attributes
%5D%5B0%5D%5Bprimary%5D=0&host%5Binterfaces_attributes%5D%5B0%5D%5Bprimary%5D=1&host%5Binterfaces_attributes
%5D%5B0%5D%5Bprovision%5D=0&host%5Binterfaces_attributes%5D%5B0%5D%5Bprovision%5D=1&host%5Binterfaces_attributes
%5D%5B0%5D%5Bvirtual%5D=0&host%5Binterfaces_attributes%5D%5B0%5D%5Btag%5D=&host%5Binterfaces_attributes
%5D%5B0%5D%5Battached_to%5D=&host%5Bcompute_attributes%5D%5Bvcpus_max%5D=1&host%5Bcompute_attributes
%5D%5Bmemory_min%5D=1073741824&host%5Bcompute_attributes%5D%5Bmemory_max%5D=1073741824&host%5Bcompute_attributes
%5D%5Bcustom_template_name%5D=&host%5Bcompute_attributes%5D%5Bbuiltin_template_name%5D=&host%5Bcompute_attributes
%5D%5Bxenstore%5D%5Bvm-data%5D%5Bifs%5D%5B0%5D%5Bip%5D=192.168.60.225&host%5Bcompute_attributes%5D%5Bxenstore
%5D%5Bvm-data%5D%5Bifs%5D%5B0%5D%5Bgateway%5D=192.168.60.1&host%5Bcompute_attributes%5D%5Bxenstore%5D
%5Bvm-data%5D%5Bifs%5D%5B0%5D%5Bnetmask%5D=255.255.255.0&host%5Bcompute_attributes%5D%5Bxenstore%5D%5Bvm-data
%5D%5Bnameserver1%5D=192.168.60.117&host%5Bcompute_attributes%5D%5Bxenstore%5D%5Bvm-data%5D%5Bnameserver2
%5D=192.168.60.1&host%5Bcompute_attributes%5D%5Bxenstore%5D%5Bvm-data%5D%5Benvironment%5D=production
&host%5Bcompute_attributes%5D%5BVBDs%5D%5Bsr_uuid%5D=6372568a-d653-9ef7-aa33-333ce68d8e94&host%5Bcompute_attributes
%5D%5BVBDs%5D%5Bphysical_size%5D=20&host%5Bcompute_attributes%5D%5BVIFs%5D%5Bprint%5D=LAN&host%5Bcompute_attributes
%5D%5Bhypervisor_host%5D=&host%5Bcompute_attributes%5D%5Bstart%5D=0&host%5Bcompute_attributes%5D%5Bstart
%5D=1&capabilities=build&provider=Xenserver&host%5Barchitecture_id%5D=1&host%5Boperatingsystem_id%5D
=2&host%5Bprovision_method%5D=build&host%5Bbuild%5D=0&host%5Bbuild%5D=1&host%5Bptable_id%5D=70&host%5Bdisk
%5D=&host%5Broot_pass%5D=&host%5Bis_owned_by%5D=3-Users&host%5Benabled%5D=0&host%5Benabled%5D=1&host
%5Bmodel_id%5D=&host%5Bcomment%5D=&bare_metal_capabilities=build&host%5Boverwrite%5D=false
Then Foreman tells me that the Lifecycle Environment and Content View haven't been selected (even though they have).
Then, if I select the Lifecycle Environment and Content View again, my POST is:
utf8=%E2%9C%93&authenticity_token=awH3BFWPc7qpsH49NDWDpHQUdQN4fN1nqxxkpr%2BtDeM%3D&host%5Bname%5D=naboo
&host%5Borganization_id%5D=1&host%5Blocation_id%5D=2&host%5Bhostgroup_id%5D=1&host%5Bcompute_resource_id
%5D=1&host%5Bcontent_facet_attributes%5D%5Blifecycle_environment_id%5D=4&host%5Bcontent_facet_attributes
%5D%5Bcontent_view_id%5D=2&host%5Bcontent_facet_attributes%5D%5Bid%5D=&host%5Benvironment_id%5D=1&host
%5Bcontent_source_id%5D=1&host%5Bpuppetclass_ids%5D%5B%5D=&host%5Bmanaged%5D=true&host%5Bprogress_report_id
%5D=d4f9a5f2-9ccd-4a38-830e-29c930556af6&host%5Btype%5D=Host%3A%3AManaged&host%5Binterfaces_attributes
%5D%5B0%5D%5B_destroy%5D=0&host%5Binterfaces_attributes%5D%5B0%5D%5Btype%5D=Nic%3A%3AManaged&host%5Binterfaces_attributes
%5D%5B0%5D%5Bmac%5D=&host%5Binterfaces_attributes%5D%5B0%5D%5Bidentifier%5D=&host%5Binterfaces_attributes
%5D%5B0%5D%5Bname%5D=naboo&host%5Binterfaces_attributes%5D%5B0%5D%5Bdomain_id%5D=1&host%5Binterfaces_attributes
%5D%5B0%5D%5Bsubnet_id%5D=1&host%5Binterfaces_attributes%5D%5B0%5D%5Bip%5D=192.168.60.225&host%5Binterfaces_attributes
%5D%5B0%5D%5Bmanaged%5D=0&host%5Binterfaces_attributes%5D%5B0%5D%5Bprimary%5D=0&host%5Binterfaces_attributes
%5D%5B0%5D%5Bprimary%5D=1&host%5Binterfaces_attributes%5D%5B0%5D%5Bprovision%5D=0&host%5Binterfaces_attributes
%5D%5B0%5D%5Bprovision%5D=1&host%5Binterfaces_attributes%5D%5B0%5D%5Bvirtual%5D=0&host%5Binterfaces_attributes
%5D%5B0%5D%5Btag%5D=&host%5Binterfaces_attributes%5D%5B0%5D%5Battached_to%5D=&host%5Bcompute_attributes
%5D%5Bvcpus_max%5D=1&host%5Bcompute_attributes%5D%5Bmemory_min%5D=268435456&host%5Bcompute_attributes
%5D%5Bmemory_max%5D=268435456&host%5Bcompute_attributes%5D%5Bcustom_template_name%5D=&host%5Bcompute_attributes
%5D%5Bbuiltin_template_name%5D=&host%5Bcompute_attributes%5D%5Bxenstore%5D%5Bvm-data%5D%5Bifs%5D%5B0
%5D%5Bip%5D=192.168.60.225&host%5Bcompute_attributes%5D%5Bxenstore%5D%5Bvm-data%5D%5Bifs%5D%5B0%5D%5Bgateway
%5D=192.168.60.1&host%5Bcompute_attributes%5D%5Bxenstore%5D%5Bvm-data%5D%5Bifs%5D%5B0%5D%5Bnetmask%5D
=255.255.255.0&host%5Bcompute_attributes%5D%5Bxenstore%5D%5Bvm-data%5D%5Bnameserver1%5D=192.168.60.117
&host%5Bcompute_attributes%5D%5Bxenstore%5D%5Bvm-data%5D%5Bnameserver2%5D=192.168.60.1&host%5Bcompute_attributes
%5D%5Bxenstore%5D%5Bvm-data%5D%5Benvironment%5D=production&host%5Bcompute_attributes%5D%5BVBDs%5D%5Bsr_uuid
%5D=6372568a-d653-9ef7-aa33-333ce68d8e94&host%5Bcompute_attributes%5D%5BVBDs%5D%5Bphysical_size%5D=&host
%5Bcompute_attributes%5D%5BVIFs%5D%5Bprint%5D=Host+internal+management+network&host%5Bcompute_attributes
%5D%5Bhypervisor_host%5D=&host%5Bcompute_attributes%5D%5Bstart%5D=0&host%5Bcompute_attributes%5D%5Bstart
%5D=1&capabilities=build&provider=Xenserver&host%5Barchitecture_id%5D=1&host%5Boperatingsystem_id%5D
=2&host%5Bprovision_method%5D=build&host%5Bbuild%5D=0&host%5Bbuild%5D=1&host%5Bptable_id%5D=70&host%5Bdisk
%5D=&host%5Broot_pass%5D=&host%5Bis_owned_by%5D=3-Users&host%5Benabled%5D=0&host%5Benabled%5D=1&host
%5Bcomment%5D=&bare_metal_capabilities=build&host%5Boverwrite%5D=false
And the "New Host" page just refreshes, but does nothing...
I get the response:
<div id="tasks_progress">
<p><span class="glyphicon glyphicon-remove "></span> failed - Set up compute instance naboo.outerrim
.lan</p>
<p><span class="glyphicon glyphicon-th "></span> pending - Query instance details for naboo.outerrim
.lan</p>
<p><span class="glyphicon glyphicon-th "></span> pending - Power up compute instance naboo.outerrim
.lan</p>
</div>
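For clarity, decoding the lifecycle environment and content view parameters from the two POST bodies above shows the difference: in the first submission they were nested under hostgroup[...] rather than host[content_facet_attributes][...]:
First POST:  hostgroup[lifecycle_environment_id]=4
             hostgroup[content_view_id]=2
Second POST: host[content_facet_attributes][lifecycle_environment_id]=4
             host[content_facet_attributes][content_view_id]=2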
#3
Updated by Dominic Cleal over 6 years ago
- Is duplicate of Bug #14166: empty facet attributes not handled - "Creating a 'new host' fails for lifecycle_env and content_view saying "can't be blank" added
#4
Updated by Dominic Cleal over 6 years ago
- Status changed from Need more information to Duplicate
Thanks for the report, this sounds like bug #14166 which is scheduled for 1.11.1, planned for this week.
#5
Updated by Martin Juhl over 6 years ago
OK, thanks.
I will test when 1.11.1 is available.
Are there release candidate packages available?
#6
Updated by Dominic Cleal over 6 years ago
No, we don't publish RCs for point releases at the moment. There are nightly packages, but you wouldn't be able to upgrade safely from a nightly "back" to 1.11.1 if you used them.
#7
Updated by Martin Juhl over 6 years ago
OK, I will just wait for the release.
Thanks again.
#8
Updated by Martin Juhl over 6 years ago
Hi
I have just updated to 1.11.1.
I still have the same issue: the two fields are still cleared, but now they are not marked in red.
The log says "Failed to save:"
2016-04-20 16:47:48 [app] [I] Parameters: {"utf8"=>"✓", "authenticity_token"=>"AKeltagO/PUS1elMjayOKxAq+ZMA89Y2kC7LTnuCBeU=", "host"=>{"name"=>"naboo", "organization_id"=>"1", "location_id"=>"2", "hostgroup_id"=>"1", "compute_resource_id"=>"1", "content_facet_attributes"=>{"lifecycle_environment_id"=>"4", "content_view_id"=>"2", "id"=>""}, "environment_id"=>"1", "content_source_id"=>"1", "puppetclass_ids"=>[""], "managed"=>"true", "progress_report_id"=>"[FILTERED]", "type"=>"Host::Managed", "interfaces_attributes"=>{"0"=>{"_destroy"=>"0", "type"=>"Nic::Managed", "mac"=>"", "identifier"=>"", "name"=>"naboo", "domain_id"=>"1", "subnet_id"=>"1", "ip"=>"192.168.60.119", "managed"=>"0", "primary"=>"1", "provision"=>"1", "virtual"=>"0", "tag"=>"", "attached_to"=>""}}, "compute_attributes"=>{"vcpus_max"=>"1", "memory_min"=>"1073741824", "memory_max"=>"1073741824", "custom_template_name"=>"", "builtin_template_name"=>"", "xenstore"=>{"vm-data"=>{"ifs"=>{"0"=>{"ip"=>"192.168.60.119", "gateway"=>"192.168.60.1", "netmask"=>"255.255.255.0"}}, "nameserver1"=>"192.168.60.117", "nameserver2"=>"192.168.60.1", "environment"=>"production"}}, "VBDs"=>{"sr_uuid"=>"6372568a-d653-9ef7-aa33-333ce68d8e94", "physical_size"=>"20"}, "VIFs"=>{"print"=>"LAN"}, "hypervisor_host"=>"", "start"=>"1"}, "architecture_id"=>"1", "operatingsystem_id"=>"2", "provision_method"=>"build", "build"=>"1", "ptable_id"=>"70", "disk"=>"", "root_pass"=>"[FILTERED]", "is_owned_by"=>"3-Users", "enabled"=>"1", "comment"=>"", "overwrite"=>"false"}, "capabilities"=>"build", "provider"=>"Xenserver", "bare_metal_capabilities"=>"build"}
2016-04-20 16:47:48 [app] [I] Failed to save:
2016-04-20 16:47:48 [app] [I] Rendered hosts/_progress.html.erb (0.2ms)
2016-04-20 16:47:48 [app] [I] Rendered puppetclasses/_selectedClasses.html.erb (0.0ms)
2016-04-20 16:47:48 [app] [I] Rendered puppetclasses/_classes_in_groups.html.erb (0.0ms)
2016-04-20 16:47:48 [app] [I] Rendered puppetclasses/_classes.html.erb (0.1ms)
2016-04-20 16:47:48 [app] [I] Rendered puppetclasses/_class_selection.html.erb (39.8ms)
2016-04-20 16:47:48 [app] [I] Rendered nic/_base_form.html.erb (17.6ms)
2016-04-20 16:47:48 [app] [I] Rendered nic/_virtual_form.html.erb (1.1ms)
2016-04-20 16:47:48 [app] [I] Rendered nic/_provider_specific_form.html.erb (0.3ms)
2016-04-20 16:47:48 [app] [I] Rendered nic/manageds/_managed.html.erb (20.3ms)
2016-04-20 16:47:48 [app] [I] Rendered nic/_base_form.html.erb (14.7ms)
2016-04-20 16:47:48 [app] [I] Rendered nic/_virtual_form.html.erb (0.9ms)
2016-04-20 16:47:48 [app] [I] Rendered nic/_provider_specific_form.html.erb (0.3ms)
2016-04-20 16:47:48 [app] [I] Rendered nic/manageds/_managed.html.erb (17.1ms)
2016-04-20 16:47:48 [app] [I] Rendered hosts/_interfaces.html.erb (39.2ms)
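Since the log only shows "Failed to save:" with no message, one hedged way to surface the validation errors is to build the host in a Foreman console and print them; errors.full_messages is standard ActiveModel, and the attribute values below are taken from the parameters above, trimmed for brevity:
# foreman-rake console   (on a packaged install; `rails console` in a dev checkout)
host = Host::Managed.new(name: 'naboo', managed: true,
                         organization_id: 1, location_id: 2, hostgroup_id: 1)
host.valid?                      # runs validations without saving
puts host.errors.full_messages   # prints the messages the log omitted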
#9
Updated by Dominic Cleal over 6 years ago
- Is duplicate of deleted (Bug #14166: empty facet attributes not handled - "Creating a 'new host' fails for lifecycle_env and content_view saying "can't be blank" )
#10
Updated by Dominic Cleal over 6 years ago
- Related to Bug #14166: empty facet attributes not handled - "Creating a 'new host' fails for lifecycle_env and content_view saying "can't be blank" added
#11
Updated by Dominic Cleal over 6 years ago
- Project changed from Foreman to Katello
- Status changed from Duplicate to New
I'll move this to Katello, as the fields are added by that plugin; they might be able to debug it.
#12
Updated by Eric Helms over 6 years ago
- Assignee set to Justin Sherrill
- Priority changed from Normal to High
- Legacy Backlogs Release (now unused) set to 86
#13
Updated by Martin Juhl over 6 years ago
Is any more info needed?
#14
Updated by Martin Juhl about 6 years ago
I did some testing today, together with gwmngilfen and jsherrill.
When using bare metal provisioning, Foreman seems to work fine. I haven't got another hypervisor around to try, but it seems the issue is related to the xenserver plugin.
#15
Updated by Greg Sutcliffe about 6 years ago
- Project changed from Katello to Xen
- Assignee deleted (Justin Sherrill)
A pastebin from IRC: http://pastebin.com/ttcEQPb9
Line 36 seems relevant: the host fails to save, but the log doesn't say why.
#17
Updated by Anonymous about 6 years ago
- Has duplicate Bug #15082: Cant create host added
#18
Updated by Alejandro Falcon about 6 years ago
I'm having the same issue when creating new Xen-based hosts in my Foreman test environment (Foreman 1.11.2, not using Katello).
#19
Updated by Anonymous about 6 years ago
Alejandro, could you test whether the changes in https://github.com/theforeman/foreman-xen/pull/40 also fix this issue, by any chance?
#20
Updated by Anonymous about 6 years ago
- Related to deleted (Bug #14166: empty facet attributes not handled - "Creating a 'new host' fails for lifecycle_env and content_view saying "can't be blank" )
#21
Updated by Alejandro Falcon about 6 years ago
Sorry, I forgot to mention that. The first thing I tried was that pull request, but I'm still having the problem. I'm happy to test anything else; just let me know.
#22
Updated by Anonymous about 6 years ago
What's probably needed is a development setup, to step through the code with a debugger while creating a host and see what's happening.
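A hedged sketch of such a setup, assuming a git checkout of Foreman and the pry-byebug gem (file locations vary by version):
# bundler.d/debug.local.rb -- Foreman's Gemfile loads bundler.d/*.rb
gem 'pry-byebug'
# After `bundle install`, put a breakpoint where the compute orchestration
# creates the VM (around the "Set up compute instance" task seen above):
binding.pry   # execution stops here; inspect compute_attributes, vm and errors
# then reproduce the host creation from the UI and step with `next` / `step`.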
#23
Updated by T Wening about 6 years ago
I experienced the same issue, but I didn't get any error, even with the debug log activated.
#24
Updated by Alejandro Falcon about 6 years ago
Another interesting thing I discovered: every time you try to create a new host and it fails, a new Xen template is created instead of a VM.
#25
Updated by Alejandro Falcon about 6 years ago
I was able to work around the issue by replacing the new fog and xen gems with the old ones (installed by Foreman 1.10); a sketch of pinning them follows the logs below. I think it's an issue with fog-xenserver having been split out of fog core. Below are the logs Xen creates in both cases; hope it helps.
GOOD:
Jun 17 12:25:50 vmhost01 xapi: [ info|vmhost01|12278066 INET 0.0.0.0:80|session.login_with_password D:...|xapi] Session.create trackid=5abbb1e23d5106db4033e21b28dac0b0 pool=false uname=root originator= is_local_superuser=true auth_user_sid= parent=trackid=9834f5af41c964e225f24279aefe4e49
Jun 17 12:25:52 vmhost01 xapi: [ info|vmhost01|12278066 INET 0.0.0.0:80|VDI.create R:6e622f5aa103|storage_impl] VDI.create dbg:OpaqueRef:6e622f5a-a103-7a0b-da7a-dc341e46fdff sr:0c6e8aee-083a-6a74-9aec-6e83b5be783b vdi_info:{"vdi": "", "content_id": "", "name_label": "test1.sng-disk1", "name_description": "test1.sng-disk_1", "ty": "system", "metadata_of_pool": "", "is_a_snapshot": false, "snapshot_time": "19700101T00:00:00Z", "snapshot_of": "", "read_only": false, "virtual_size": 21474836480, "physical_utilisation": 0, "persistent": true, "sm_config": {}}
Jun 17 12:25:52 vmhost01 xapi: [ info|vmhost01|12278066 INET 0.0.0.0:80|sm_exec D:1f8dcebec9ba|xapi] Session.create trackid=af19b5aa59145c56a294d1d60a31f811 pool=false uname= originator= is_local_superuser=true auth_user_sid= parent=trackid=9834f5af41c964e225f24279aefe4e49
Jun 17 12:25:52 vmhost01 xapi: [ info|vmhost01|12278073 UNIX /var/xapi/xapi|dispatch:VDI.db_introduce D:207d3b1e74a5|taskhelper] task VDI.db_introduce R:3330c6ede0d8 (uuid:f3d3103f-b17f-8876-dc03-9899b96e829f) created (trackid=af19b5aa59145c56a294d1d60a31f811) by task D:9c945f5ba108
Jun 17 12:25:52 vmhost01 xapi: [ info|vmhost01|12278066 INET 0.0.0.0:80|sm_exec D:1f8dcebec9ba|xapi] Session.destroy trackid=af19b5aa59145c56a294d1d60a31f811
Jun 17 12:25:57 vmhost01 xapi: [ info|vmhost01|12278079 UNIX /var/xapi/xapi|session.slave_login D:5ae48312023c|xapi] Session.create trackid=45368880bae2e38bfeb067feda20bd7c pool=true uname= originator= is_local_superuser=true auth_user_sid= parent=trackid=9834f5af41c964e225f24279aefe4e49
Jun 17 12:25:57 vmhost01 xapi: [ info|vmhost01|12278082 UNIX /var/xapi/xapi|session.logout D:415be54c4f0e|xapi] Session.destroy trackid=45368880bae2e38bfeb067feda20bd7c
Jun 17 12:25:57 vmhost01 xapi: [ warn|vmhost01|12278066 INET 0.0.0.0:80|VM.provision R:419c0fc08a2f|xapi] VM test1.sng could run on any of these hosts: [ compute11; compute12 ]
Jun 17 12:25:57 vmhost01 xapi: [ info|vmhost01|12278083 UNIX /var/xapi/xapi|session.slave_login D:91a4cf365c55|xapi] Session.create trackid=8d45ddbe5f98a63e4bba6fb11b616b2a pool=true uname= originator= is_local_superuser=true auth_user_sid= parent=trackid=9834f5af41c964e225f24279aefe4e49
Jun 17 12:25:57 vmhost01 xapi: [ info|vmhost01|12278066 INET 0.0.0.0:80|VM.provision R:419c0fc08a2f|xapi] VM.set_is_a_template('false')
Jun 17 12:25:57 vmhost01 xapi: [ info|vmhost01|12278088 UNIX /var/xapi/xapi|session.logout D:0892d6b66134|xapi] Session.destroy trackid=8d45ddbe5f98a63e4bba6fb11b616b2a
BAD (note that, unlike the good run above, this log never reaches VM.provision or VM.set_is_a_template('false'), consistent with the VM being left as a template):
Jun 17 12:32:21 vmhost01 xapi: [ info|vmhost01|12278561 INET 0.0.0.0:80|session.login_with_password D:...|xapi] Session.create trackid=cedf89e0767206e17337338843ef7f6a pool=false uname=root originator= is_local_superuser=true auth_user_sid= parent=trackid=9834f5af41c964e225f24279aefe4e49
Jun 17 12:32:21 vmhost01 xapi: [ info|vmhost01|12278561 INET 0.0.0.0:80|VDI.create R:8689ac053d94|storage_impl] VDI.create dbg:OpaqueRef:8689ac05-3d94-1b97-e5ed-3da3fdc771c0 sr:0c6e8aee-083a-6a74-9aec-6e83b5be783b vdi_info:{"vdi": "", "content_id": "", "name_label": "test1.tst-disk1", "name_description": "test1.tst-disk_1", "ty": "system", "metadata_of_pool": "", "is_a_snapshot": false, "snapshot_time": "19700101T00:00:00Z", "snapshot_of": "", "read_only": false, "virtual_size": 21474836480, "physical_utilisation": 0, "persistent": true, "sm_config": {}}
Jun 17 12:32:21 vmhost01 xapi: [ info|vmhost01|12278561 INET 0.0.0.0:80|sm_exec D:dc5e8315a0cb|xapi] Session.create trackid=c5f3f9529d7e7c805bc916fa98cd8dbb pool=false uname= originator= is_local_superuser=true auth_user_sid= parent=trackid=9834f5af41c964e225f24279aefe4e49
Jun 17 12:32:21 vmhost01 xapi: [ info|vmhost01|12278567 UNIX /var/xapi/xapi|dispatch:VDI.db_introduce D:799e7272a60d|taskhelper] task VDI.db_introduce R:99a9f31e87f8 (uuid:30ae95cd-6dae-4b4b-1667-1b115de22023) created (trackid=c5f3f9529d7e7c805bc916fa98cd8dbb) by task D:45d50a4bfdde
Jun 17 12:32:21 vmhost01 xapi: [ info|vmhost01|12278561 INET 0.0.0.0:80|sm_exec D:dc5e8315a0cb|xapi] Session.destroy trackid=c5f3f9529d7e7c805bc916fa98cd8dbb
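For reference, a sketch of how the downgrade described above could be pinned on a source install of Foreman; the file name and version numbers below are placeholders, not the actual versions Foreman 1.10 shipped:
# bundler.d/fog_pin.local.rb -- hypothetical pin file; versions are placeholders
gem 'fog', '1.33.0'              # placeholder for the fog version Foreman 1.10 used
gem 'fog-xenserver', '0.1.0'     # placeholder for the pre-regression release
# then run `bundle install` and restart the Foreman service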
#26
Updated by Anonymous about 6 years ago
#27
Updated by Ivan G about 6 years ago
Same issue here:
Failed to save:
and the remainder looks similar. Let me know if you need to see my logs; I have a fresh install of Foreman and the xenserver plugin.
#28
Updated by Alejandro Falcon almost 6 years ago
- File unabletosave.PNG unabletosave.PNG added
I've forked foreman-xen and made some changes here: https://github.com/alejandrocfg/foreman-xen/commit/4e06c89a273b8f8a9c3e9d1dc952dfd42bea8344 that allow VMs to be created. The issue is now on the Foreman side, when it checks for the NIC:
2016-08-22 22:10:11 [app] [I] Failed to save: Could not find virtual machine network interface matching 10.23.250.28
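For context, a simplified sketch (not Foreman's exact code; the names are approximations) of the orchestration step that raises this error: Foreman fetches the created VM's interfaces from the compute resource and tries to pair each host NIC with one of them, and when the provider reports no usable MAC or IP, the match fails and the VM is rolled back:
# approximation of Foreman's NIC-matching step, for illustration only
host.interfaces.each do |nic|
  vm_nic = vm.interfaces.detect { |vi| vi.mac == nic.mac || vi.ip == nic.ip }
  unless vm_nic
    raise "Could not find virtual machine network interface matching #{nic.ip}"
  end
  nic.mac = vm_nic.mac   # adopt the MAC the hypervisor assigned
end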
#29
Updated by Alejandro Falcon almost 6 years ago
The VM gets created properly, but Foreman rolls it back after the NIC matching failure.
#30
Updated by Alejandro Falcon almost 6 years ago
Finally, I was able to fix the issue in my test environment and created a pull request with the necessary changes: https://github.com/theforeman/foreman-xen/pull/43
#31
Updated by Anonymous almost 6 years ago
- Assignee changed from Operations ooVoo to Anonymous
- Status changed from New to Ready For Testing
- Pull request https://github.com/theforeman/foreman-xen/pull/43 added
#32
Updated by Anonymous almost 6 years ago
- Status changed from Ready For Testing to Closed
- % Done changed from 0 to 100
Applied in changeset foreman-xen|9e5f80de37372be3a701b7635e29eb4a353450d8.
#33
Updated by Anonymous almost 6 years ago
Version 0.3.1 of the xen plugin, containing this fix, should be in the YUM/APT repos now.
#34
Updated by Martin Juhl almost 6 years ago
Which repositories is the plugin released to?
#35
Updated by Anonymous almost 6 years ago
Our YUM/APT repos, and also rubygems.org.
#36
Updated by Martin Juhl almost 6 years ago
#37
Updated by Anonymous almost 6 years ago
OK, it's different with YUM, per Dominic:
<mmoll> Dominic: are there any tests for plugins before they show up in the yum repo?
<Dominic> yes, repoclosure runs daily at around 5pm UTC
#38
Updated by Greg Sutcliffe almost 4 years ago
- Target version deleted (Katello 3.0.0)