Data Analyst (1 opening)
10,000-14,999 RMB/month
Job Responsibilities
1. Develop and maintain ETL pipelines using Airflow/Python.
2. Build big data pipelines to maintain Kargo's data lake.
3. Ensure that users have the right level of access to our data assets.
4. Coordinate with the Insights team to extract data for ad-hoc reporting.
5. Design impactful visualizations using a variety of BI tools.
6. Detect and clean any deviations in the data.
7. Communicate technical and business topics, as appropriate, using written, verbal and/or presentation materials as necessary.
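As an illustration of the pipeline work described above, here is a minimal extract-transform-load sketch. In production the three steps would be orchestrated as Airflow tasks; the record shapes, the negative-count check, and the in-memory "data lake" are all assumptions invented for the demo, not Kargo's actual setup.

```python
# Illustrative ETL sketch; every name and threshold here is hypothetical.

def extract():
    """Simulate pulling raw records from a source system."""
    return [
        {"user": "a", "clicks": 10},
        {"user": "b", "clicks": -3},  # deviation: negative count
        {"user": "c", "clicks": 42},
    ]

def transform(records):
    """Detect and drop records that deviate from the expected range."""
    return [r for r in records if r["clicks"] >= 0]

def load(records, lake):
    """Append cleaned records to a stand-in for the data lake."""
    lake.extend(records)
    return len(records)

data_lake = []
loaded = load(transform(extract()), data_lake)
print(loaded)  # 2 clean records reach the lake
```

In Airflow, each function would become its own task so that a failed load can be retried without re-running the extract.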
Relevant Experience and Skill Requirements
1. Experience in big data processing using Apache Hadoop/Spark ecosystem applications such as Hadoop, Hive, Spark, Kafka and HDFS preferable.
2. Must have strong experience in data warehouse ETL design and development, including methodologies, tools, processes and best practices.
3. Strong experience creating stellar dashboards and reports for C-level executives.
4. You should love and be passionate about data, and be able to demonstrate experience engaging with truly large data sets.
5. Strong experience in cloud technologies such as AWS, Azure or Aliyun.
6. Strong experience in creating data pipelines using Python, Airflow or similar ETL tools.
7. Strong experience in data modelling.
8. Expert SQL skills.
9. Good command of Linux.
10. Strong learning ability: you can demonstrate the skills required to research a subject, gain extensive knowledge of it, and reach an appreciable level of expertise.
11. A bachelor's degree is required unless you have substantial work or real-life experience.
12. Good communication skills in English and Chinese.
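For illustration, the kind of SQL the role calls for might be an aggregate query feeding an executive dashboard. The table, columns and figures below are invented for the demo (shown via Python's built-in sqlite3 so the sketch is self-contained):

```python
import sqlite3

# Hypothetical revenue-by-region rollup for a C-level dashboard.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (region TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("north", 120.0), ("north", 80.0), ("south", 50.0)],
)
rows = conn.execute(
    "SELECT region, SUM(revenue) AS total "
    "FROM events GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('north', 200.0), ('south', 50.0)]
```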
Contact Information
- Company address: Shanghai, Jing'an ****
- Postal code: ********
- Phone: ********
- Contact person: ********
- Fax: ********
- E-mail: ********