I have recently installed Airflow for my workflows. While creating my project, I executed the following command:

airflow initdb

which returned the following error:

[2016-08-15 11:17:00,314] {__init__.py:36} INFO - Using executor SequentialExecutor
DB: sqlite:////Users/mikhilraj/airflow/airflow.db
[2016-08-15 11:17:01,319] {db.py:222} INFO - Creating tables
INFO  [alembic.runtime.migration] Context impl SQLiteImpl.
INFO  [alembic.runtime.migration] Will assume non-transactional DDL.
ERROR [airflow.models.DagBag] Failed to import: /usr/local/lib/python2.7/site-packages/airflow/example_dags/example_twitter_dag.py
Traceback (most recent call last):
    File "/usr/local/lib/python2.7/site-packages/airflow/models.py", line 247, in process_file
       m = imp.load_source(mod_name, filepath)
    File "/usr/local/lib/python2.7/site-packages/airflow/example_dags/example_twitter_dag.py", line 26, in <module>
       from airflow.operators import BashOperator, HiveOperator, PythonOperator
ImportError: cannot import name HiveOperator
Done.

I checked some similar issues on the web, which suggested installing airflow[hive] and pyhs2, but it doesn't seem to work.


Are you using the HiveOperator? It seems like the error you are getting comes from one of the example DAGs. In production you should probably set load_examples to False, and install airflow[hive] only if you are actually using the HiveOperator.

That being said, I'm not sure why airflow[hive] isn't enough for you. You may try installing airflow[hive,hdfs,jdbc], but airflow[hive] alone should be enough to get rid of the HiveOperator import error. Could you perhaps add what other error you are getting?
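For reference, the install commands for the extras mentioned above look roughly like this (a sketch using the pip extras syntax; quoting the brackets avoids glob expansion issues in some shells such as zsh):

```shell
# Install the Hive extra (pulls in the HiveOperator dependencies)
pip install "airflow[hive]"

# Or, if you also need HDFS and JDBC support:
pip install "airflow[hive,hdfs,jdbc]"
```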

Seems like that is the issue. In production, airflow[hive] worked for me. Can you tell me how to set load_examples to False? – Rusty Aug 16 '16 at 8:18
Check out the airflow.cfg file. Airflow automatically creates a default airflow.cfg for you in the AIRFLOW_HOME dir. The file has a variable load_examples, which is set to True by default. – Vineet Goel Aug 16 '16 at 18:41
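For illustration, the relevant setting lives under the [core] section of that file (path assumed to be $AIRFLOW_HOME/airflow.cfg):

```ini
[core]
# Don't load the bundled example DAGs; this avoids importing
# example_twitter_dag.py and its HiveOperator dependency.
load_examples = False
```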
    
Yeah, this worked on my local setup as well. – Rusty Aug 17 '16 at 7:40
The command pip install airflow[hive] was sufficient to resolve the error on a fresh install for me. – Taylor Edmiston Nov 14 '16 at 21:01