
Apache Flink cannot be installed #136

@simonj217

Description

Hi @lucasbak,

I was installing Apache Flink through Ambari and the installation fails with an error. I'm attaching the related log for your review; the traceback ends with a NameError in params.py, and I've added a rough guess at the missing piece after the log.

Ambari version: 1.4.6 on RHEL 9
ODP version: ODP-1.3.1.0 with JDK 17


stderr:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/ODP/1.0/services/FLINK/package/scripts/flink_client.py", line 55, in
SparkClient().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 350, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/ODP/1.0/services/FLINK/package/scripts/flink_client.py", line 35, in install
self.configure(env)
File "/var/lib/ambari-agent/cache/stacks/ODP/1.0/services/FLINK/package/scripts/flink_client.py", line 38, in configure
import params
File "/var/lib/ambari-agent/cache/stacks/ODP/1.0/services/FLINK/package/scripts/params.py", line 260, in
if java_version >= 17:
NameError: name 'java_version' is not defined
stdout:
2025-11-18 19:50:26,452 - Stack Feature Version Info: Cluster Stack=1.3, Command Stack=None, Command Version=None -> 1.3
2025-11-18 19:50:26,453 - Using hadoop conf dir: /usr/odp/1.3.1.0-174/hadoop/conf
2025-11-18 19:50:26,454 - Skipping param: datanode_max_locked_memory, due to Configuration parameter 'dfs.datanode.max.locked.memory' was not found in configurations dictionary!
2025-11-18 19:50:26,454 - Skipping param: dfs_ha_namenode_ids, due to Configuration parameter 'dfs.ha.namenodes' was not found in configurations dictionary!
2025-11-18 19:50:26,454 - Skipping param: falcon_user, due to Configuration parameter 'falcon-env' was not found in configurations dictionary!
2025-11-18 19:50:26,454 - Skipping param: gmetad_user, due to Configuration parameter 'ganglia-env' was not found in configurations dictionary!
2025-11-18 19:50:26,454 - Skipping param: gmond_user, due to Configuration parameter 'ganglia-env' was not found in configurations dictionary!
2025-11-18 19:50:26,454 - Skipping param: oozie_user, due to Configuration parameter 'oozie-env' was not found in configurations dictionary!
2025-11-18 19:50:26,454 - Skipping param: ranger_group, due to Configuration parameter 'ranger-env' was not found in configurations dictionary!
2025-11-18 19:50:26,454 - Skipping param: ranger_user, due to Configuration parameter 'ranger-env' was not found in configurations dictionary!
2025-11-18 19:50:26,454 - Skipping param: repo_info, due to Configuration parameter 'repoInfo' was not found in configurations dictionary!
2025-11-18 19:50:26,454 - Group['flink'] {}
2025-11-18 19:50:26,456 - Adding group Group['flink']
2025-11-18 19:50:26,468 - Group['livy'] {}
2025-11-18 19:50:26,469 - Adding group Group['livy']
2025-11-18 19:50:26,480 - Group['spark'] {}
2025-11-18 19:50:26,480 - Adding group Group['spark']
2025-11-18 19:50:26,492 - Group['hdfs'] {}
2025-11-18 19:50:26,492 - Group['zeppelin'] {}
2025-11-18 19:50:26,493 - Adding group Group['zeppelin']
2025-11-18 19:50:26,503 - Group['hadoop'] {}
2025-11-18 19:50:26,504 - Group['users'] {}
2025-11-18 19:50:26,505 - User['yarn-ats'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2025-11-18 19:50:26,506 - User['hive'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2025-11-18 19:50:26,506 - User['infra-solr'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2025-11-18 19:50:26,507 - User['httpfs'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2025-11-18 19:50:26,508 - User['zookeeper'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2025-11-18 19:50:26,509 - User['ams'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2025-11-18 19:50:26,509 - User['tez'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop', 'users'], 'fetch_nonlocal_groups': True}
2025-11-18 19:50:26,510 - User['zeppelin'] {'uid': None, 'gid': 'hadoop', 'groups': ['zeppelin', 'hadoop'], 'fetch_nonlocal_groups': True}
2025-11-18 19:50:26,510 - Adding user User['zeppelin']
2025-11-18 19:50:26,528 - User['ozone'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2025-11-18 19:50:26,529 - User['flink'] {'uid': None, 'gid': 'hadoop', 'groups': ['flink', 'hadoop'], 'fetch_nonlocal_groups': True}
2025-11-18 19:50:26,529 - Adding user User['flink']
2025-11-18 19:50:26,546 - User['logsearch'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2025-11-18 19:50:26,547 - User['livy'] {'uid': None, 'gid': 'hadoop', 'groups': ['livy', 'hadoop'], 'fetch_nonlocal_groups': True}
2025-11-18 19:50:26,547 - Adding user User['livy']
2025-11-18 19:50:26,564 - User['spark'] {'uid': None, 'gid': 'hadoop', 'groups': ['spark', 'hadoop'], 'fetch_nonlocal_groups': True}
2025-11-18 19:50:26,564 - Adding user User['spark']
2025-11-18 19:50:26,580 - User['ambari-qa'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop', 'users'], 'fetch_nonlocal_groups': True}
2025-11-18 19:50:26,581 - User['kafka'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2025-11-18 19:50:26,582 - User['hdfs'] {'uid': None, 'gid': 'hadoop', 'groups': ['hdfs', 'hadoop'], 'fetch_nonlocal_groups': True}
2025-11-18 19:50:26,582 - User['sqoop'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2025-11-18 19:50:26,583 - User['yarn'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2025-11-18 19:50:26,584 - User['mapred'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2025-11-18 19:50:26,584 - User['hbase'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2025-11-18 19:50:26,585 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0o555}
2025-11-18 19:50:26,586 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2025-11-18 19:50:26,590 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2025-11-18 19:50:26,590 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'mode': 0o775, 'create_parents': True, 'cd_access': 'a'}
2025-11-18 19:50:26,591 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0o555}
2025-11-18 19:50:26,591 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0o555}
2025-11-18 19:50:26,591 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2025-11-18 19:50:26,597 - call returned (0, '1010')
2025-11-18 19:50:26,597 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1010'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2025-11-18 19:50:26,601 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1010'] due to not_if
2025-11-18 19:50:26,601 - Group['hdfs'] {}
2025-11-18 19:50:26,602 - User['hdfs'] {'groups': ['hdfs', 'hadoop', 'hdfs'], 'fetch_nonlocal_groups': True}
2025-11-18 19:50:26,602 - FS Type: HDFS
2025-11-18 19:50:26,602 - Directory['/etc/hadoop'] {'mode': 0o755}
2025-11-18 19:50:26,608 - File['/usr/odp/1.3.1.0-174/hadoop/conf/hadoop-env.sh'] {'owner': 'hdfs', 'group': 'hadoop', 'content': InlineTemplate(...)}
2025-11-18 19:50:26,609 - Writing File['/usr/odp/1.3.1.0-174/hadoop/conf/hadoop-env.sh'] because contents don't match
2025-11-18 19:50:26,609 - Changing owner for /tmp/tmp1763482826.6092613_372 from 0 to hdfs
2025-11-18 19:50:26,609 - Changing group for /tmp/tmp1763482826.6092613_372 from 0 to hadoop
2025-11-18 19:50:26,609 - Moving /tmp/tmp1763482826.6092613_372 to /usr/odp/1.3.1.0-174/hadoop/conf/hadoop-env.sh
2025-11-18 19:50:26,613 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0o1777}
2025-11-18 19:50:26,634 - Skipping param: gmetad_user, due to Configuration parameter 'ganglia-env' was not found in configurations dictionary!
2025-11-18 19:50:26,634 - Skipping param: gmond_user, due to Configuration parameter 'ganglia-env' was not found in configurations dictionary!
2025-11-18 19:50:26,634 - Skipping param: repo_info, due to Configuration parameter 'repoInfo' was not found in configurations dictionary!
2025-11-18 19:50:26,634 - Repository for ODP/1.3.1.0-174/ODP-1.3 is not managed by Ambari
2025-11-18 19:50:26,634 - Repository[None] {'action': ['create']}
2025-11-18 19:50:26,636 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2025-11-18 19:50:26,783 - Skipping installation of existing package unzip
2025-11-18 19:50:26,783 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2025-11-18 19:50:26,856 - Skipping installation of existing package curl
2025-11-18 19:50:26,857 - The repository with version 1.3.1.0-174 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2025-11-18 19:50:27,099 - Package['flink_1_3_1_0_174'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2025-11-18 19:50:27,281 - Installing package flink_1_3_1_0_174 ('/usr/bin/yum -y install flink_1_3_1_0_174')
2025-11-18 19:50:29,872 - Using hadoop conf dir: /usr/odp/1.3.1.0-174/hadoop/conf
2025-11-18 19:50:29,875 - Using hadoop conf dir: /usr/odp/1.3.1.0-174/hadoop/conf
2025-11-18 19:50:29,877 - The repository with version 1.3.1.0-174 for this command has been marked as resolved. It will be used to report the version of the component which was installed

Command failed after 1 tries
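
The NameError means params.py reaches the check at line 260 (if java_version >= 17:) without java_version ever being assigned. As an illustration only (this is not the ODP params.py; the function, the java_home value, and the path are assumptions on my part), a minimal sketch of how a java_version variable could be derived before that comparison might look like this:

import re
import subprocess

def detect_java_major_version(java_home):
    """Return the JDK major version (e.g. 8, 11, 17) by parsing `java -version` output."""
    # `java -version` writes its banner to stderr, not stdout
    result = subprocess.run(
        [java_home + "/bin/java", "-version"],
        capture_output=True, text=True
    )
    match = re.search(r'version "(\d+)(?:\.(\d+))?', result.stderr)
    if not match:
        return None
    major = int(match.group(1))
    # Pre-JDK-9 releases report themselves as 1.x (e.g. "1.8.0_372" means JDK 8)
    return int(match.group(2)) if major == 1 else major

# Hypothetical: java_home would come from the cluster's java_home configuration.
java_version = detect_java_major_version("/usr/lib/jvm/jre-17")  # illustrative path
if java_version is not None and java_version >= 17:
    # e.g. enable whatever JDK-17-specific handling params.py intends at line 260
    pass

Until the stack script defines java_version (or guards the comparison), the Flink client install keeps failing at the configure step with this NameError.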
