EMS5**0 Spring 2024 Homework #0
Release date: Jan 10, 2024
Due date: Jan 21, 2024 (Sunday) 23:59 pm
(Note: The course add-drop period ends at 5:30 pm on Jan 22.)
No late homework will be accepted!
Every Student MUST include the following statement, together with his/her signature in the
submitted homework.
I declare that the assignment submitted on the Elearning system is
original except for source material explicitly acknowledged, and that the
same or related material has not been previously submitted for another
course. I also acknowledge that I am aware of University policy and
regulations on honesty in academic work, and of the disciplinary
guidelines and procedures applicable to breaches of such policy and
regulations, as contained in the website
Submission notice:
● Submit your homework via the elearning system
General homework policies:
A student may discuss the problems with others. However, the work a student turns in must
be created COMPLETELY by oneself ALONE. A student may not share ANY written work or
pictures, nor may one copy answers from any source other than one’s own brain.
Each student MUST LIST on the homework paper the name of every person he/she has
discussed or worked with. If the answer includes content from any other source, the
student MUST STATE THE SOURCE. Failure to do so is cheating and will result in
sanctions. Copying answers from someone else is cheating even if one lists their name(s) on
the homework.
If there is information you need to solve a problem but the information is not stated in the
problem, try to find the data somewhere. If you cannot find it, state what data you need,
make a reasonable estimate of its value and justify any assumptions you make. You will be
graded not only on whether your answer is correct, but also on whether you have done an
intelligent analysis.
Q0 [10 marks]: Secure Virtual Machines Setup on the Cloud
In this task, you are required to set up virtual machines (VMs) on a cloud computing
platform. While you are free to choose any cloud platform, Google Cloud is recommended.
References [1] and [2] provide the tutorial for Google Cloud and Amazon AWS, respectively.
The default network settings in each cloud platform are insecure. Your VM can be hacked
by external users, resulting in resource overuse that could leave your credit card with a
bill of up to USD 5,000. To protect your VMs from being hacked and to prevent any
financial losses, you should set up secure network configurations for all your VMs.
In this part, you need to set up a whitelist for your VMs. Choose one of the following
options for your whitelist:
1. Only the IP address of your current device can access your VMs via SSH; traffic from
all other sources should be blocked.
2. Only users in the CUHK network can access your VMs via SSH; traffic from outside
CUHK should be blocked. You can connect to the CUHK VPN to ensure you are in the CUHK
network (IP range: 137.189.0.0/16). Reference [3] provides the CUHK VPN setup
information from ITSC.
a. [10 marks] Secure Virtual Machine Setup
References [4] and [5] are the user guides for the network security configuration of
AWS and Google Cloud, respectively. Go through the document for the cloud
platform you use, then follow the listed steps to configure your VM's
network:
i. Locate or create the security group/ firewall of your VM;
ii. Remove all inbound/ ingress and outbound/ egress rules, except for the
default rule(s) responsible for internal access within the cloud platform;
iii. Add a new inbound/ ingress rule with the SSH port(s) of the VMs (default:
22) and the source specified, e.g., ‘137.189.0.0/16’ for CUHK users only;
iv. (Optional) Permit more ports as needed (e.g., when completing Q1 below).
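On Google Cloud, the steps above can be sketched with the gcloud CLI. This is a minimal sketch, not the required procedure: the rule name allow-ssh-cuhk is a placeholder, and it assumes your VMs are on the default VPC network.

```shell
# Allow SSH (tcp:22) into the default network, but only from the CUHK
# range; with no broader allow rules present, other ingress is denied.
gcloud compute firewall-rules create allow-ssh-cuhk \
    --network=default \
    --direction=INGRESS \
    --action=ALLOW \
    --rules=tcp:22 \
    --source-ranges=137.189.0.0/16
```

The same effect can be achieved in the Cloud Console firewall page; the CLI form is simply easier to record in your report.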
Q1 [** marks + 20 bonus marks]: Hadoop Cluster Setup
Hadoop is an open-source software framework for distributed storage and processing. In this
problem, you are required to set up a Hadoop cluster using the VMs you instantiated in Q0.
In order to set up a Hadoop cluster with multiple virtual machines (VMs), you can first set up a
single-node Hadoop cluster on each VM [6], then modify the configuration files on each
node to form a Hadoop cluster with multiple nodes. References [7], [9], [10], and [11] provide
the setup instructions for a Hadoop cluster. Some important notes/ tips on instantiating VMs
are given at the end of this section.
a. [20 marks] Single-node Hadoop Setup
In this part, you need to set up a single-node Hadoop cluster in a pseudo-distributed
mode and run the Terasort example on your Hadoop cluster.
i. Set up a single-node Hadoop cluster (recommended Hadoop version: 2.9.x;
all versions are available in [16]). Attach a screenshot of http://localhost:50070
(or http://<your-vm-ip>:50070 if opened in the browser of your local machine) to
verify that your installation is successful.
ii. After installing a single-node Hadoop cluster, you need to run the Terasort
example [8] on it. You need to record all your key steps, including your
commands and output. The following commands may be useful:
$ ./bin/hadoop jar \
./share/hadoop/mapreduce/hadoop-mapreduce-examples-2.9.2.jar \
teragen 120000 terasort/input
# generate the data for sorting
$ ./bin/hadoop jar \
./share/hadoop/mapreduce/hadoop-mapreduce-examples-2.9.2.jar \
terasort terasort/input terasort/output
# terasort the generated data
$ ./bin/hadoop jar \
./share/hadoop/mapreduce/hadoop-mapreduce-examples-2.9.2.jar \
teravalidate terasort/output terasort/check
# validate that the output is sorted
Notes: To monitor the Hadoop service via the Hadoop NameNode WebUI (http://<ip>:50070) in
your local browser, you may, following the steps in Q0, further allow traffic from the
CUHK network to access port 50070 of your VMs.
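For reference, the pseudo-distributed mode described in [6] is driven by two files under etc/hadoop/. The values below are the ones used in the Hadoop 2.x single-node documentation; adjust paths and ports to your own install:

```xml
<!-- etc/hadoop/core-site.xml: HDFS endpoint for the single node -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- etc/hadoop/hdfs-site.xml: keep one copy of each block on a single node -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
```

After editing these, format the namenode (./bin/hdfs namenode -format) and start HDFS with ./sbin/start-dfs.sh before checking port 50070.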
b. [40 marks] Multi-node Hadoop Cluster Setup
After the setup of a single-node Hadoop cluster in each VM, you can modify the
configuration files in each node to set up the multi-node Hadoop cluster.
i. Install and set up a multi-node Hadoop cluster with 4 VMs (1 Master and 3
Slaves). Use the ‘jps’ command to verify all the processes are running.
ii. In this part, you need to use the ‘teragen’ command to generate 2 different
datasets to serve as the input for the Terasort program. You should use the
following two rules to determine the size of the two datasets of your own:
■ Size of dataset 1: (Your student ID % 3 + 1) GB
■ Size of dataset 2: (Your student ID % 20 + 10) GB
Then, run the Terasort code again for these two different datasets and
compare their running time.
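Since each TeraGen row is 100 bytes, the row count you pass to teragen for a D-GB dataset is D × 10^9 / 100. The arithmetic can be sketched as follows; the student ID below is hypothetical, so substitute your own:

```shell
SID=1155123456                      # hypothetical student ID -- use yours
SIZE1_GB=$(( SID % 3 + 1 ))         # rule for dataset 1
SIZE2_GB=$(( SID % 20 + 10 ))       # rule for dataset 2
ROWS1=$(( SIZE1_GB * 1000000000 / 100 ))   # 100-byte rows per dataset
ROWS2=$(( SIZE2_GB * 1000000000 / 100 ))
echo "dataset1: ${SIZE1_GB} GB = ${ROWS1} rows"
echo "dataset2: ${SIZE2_GB} GB = ${ROWS2} rows"
# then, e.g.:
# ./bin/hadoop jar ./share/hadoop/mapreduce/hadoop-mapreduce-examples-2.9.2.jar \
#     teragen $ROWS1 terasort/input1
```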
Hints: Keep an image for your Hadoop cluster. You would need to use the Hadoop
cluster again for subsequent homework assignments.
Notes:
1. You may need to add each VM to the whitelist of your security group/ firewall
and further allow traffic towards more ports needed by Hadoop/YARN
services (reference [17] [18]).
2. For step i, the resulting cluster should consist of 1 namenode and 4
datanodes. More precisely, 1 namenode and 1 datanode would be running on
the master machine, and each slave machine runs one datanode.
3. Please ensure that after the cluster setup, the number of “Live Nodes” shown
on Hadoop NameNode WebUI (port 50070) is 4.
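As a sketch of the multi-node changes, assuming placeholder hostnames master and slave1–slave3 for your four VMs: every node's core-site.xml points at the master, and the master's slaves file (Hadoop 2.x) lists every datanode host, including the master itself per Note 2:

```
# etc/hadoop/core-site.xml on ALL nodes:
#   fs.defaultFS = hdfs://master:9000
#
# etc/hadoop/slaves on the master -- one datanode host per line:
master
slave1
slave2
slave3
```

After running ./sbin/start-dfs.sh on the master, jps should show NameNode and DataNode on the master and one DataNode on each slave, giving 4 Live Nodes in the WebUI.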
c. [30 marks] Running Python Code on Hadoop
Hadoop streaming is a utility that comes with the Hadoop distribution. This utility
allows you to create and run MapReduce jobs with any executable or script as the
mapper and/or the reducer. In this part, you need to run the Python wordcount script
to handle the Shakespeare dataset [12] via Hadoop streaming.
i. Reference [13] introduces the method to run a Python wordcount script via
Hadoop streaming. You can also download the script from the reference [14].
ii. Run the Python wordcount script and record the running time. The following
command may be useful:
$ ./bin/hadoop jar \
./share/hadoop/tools/lib/hadoop-streaming-2.9.2.jar \
-file mapper.py -mapper mapper.py \
-file reducer.py -reducer reducer.py \
-input input/* \
-output output
# submit a Python program via Hadoop streaming
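For concreteness, here is a minimal sketch of the mapper/reducer logic that the scripts from [13] and [14] implement. The function names are mine, not from those references; the real mapper.py and reducer.py each read stdin line by line and print to stdout.

```python
def map_words(lines):
    """Mapper: emit one 'word<TAB>1' line per input word."""
    for line in lines:
        for word in line.strip().split():
            yield "%s\t1" % word

def reduce_counts(lines):
    """Reducer: streaming delivers input sorted by key, so counts for the
    same word are adjacent and can be summed in a single pass."""
    current, total = None, 0
    for line in lines:
        word, count = line.rsplit("\t", 1)
        if word == current:
            total += int(count)
        else:
            if current is not None:
                yield "%s\t%d" % (current, total)
            current, total = word, int(count)
    if current is not None:
        yield "%s\t%d" % (current, total)

# In the actual mapper.py / reducer.py, each function is driven by stdin:
#   for out in map_words(sys.stdin): print(out)
```

Note that Hadoop sorts the mapper output between the two stages, which is why the reducer only needs to compare adjacent keys.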
d. [Bonus 20 marks] Compiling the Java WordCount program for MapReduce
The Hadoop framework is written in Java. You can easily compile and submit a Java
MapReduce job. In this part, you need to compile and run your own Java wordcount
program to process the Shakespeare dataset [12].
i. In order to compile the Java MapReduce program, you may need to use the
“hadoop classpath” command to fetch the list of all Hadoop jars. Alternatively, you can
simply copy all dependency jars into a directory and use them for compilation.
Reference [15] introduces the method to compile and run a Java wordcount
program in the Hadoop cluster. You can also download the Java wordcount
program from reference [14].
ii. Run the Java wordcount program and compare the running time with part c.
Part (d) is a bonus question for IERG 4300 but required for ESTR 4300.
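The compile-and-run flow described above can be sketched as follows; WordCount is a placeholder file/class name, so adjust it to your own program and paths:

```shell
# Compile against the full Hadoop classpath, bundle into a jar, then submit.
mkdir -p classes
javac -classpath "$(hadoop classpath)" -d classes WordCount.java
jar cf wordcount.jar -C classes .
hadoop jar wordcount.jar WordCount input/ output/
```

Record both the compilation commands and the job output in your report, since part (d) asks you to compare its running time with part (c).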
IMPORTANT NOTES:
1. Since AWS no longer provides free credits, we recommend using Google
Cloud (which offers a **-day, $300 free trial) for this homework.
2. If you use PuTTY as your SSH client, please download it from
https://www.putty.org/ and avoid using the default private key. Failure to do so will
subject your AWS account/ Hadoop cluster to hijacking.
3. Launching instances with Ubuntu (version >= 18.04 LTS) is recommended. Hadoop
version 2.9.x is recommended. Older versions of Hadoop may have vulnerabilities
that can be exploited by hackers to launch DoS attacks.
4. (AWS) For each VM, you are recommended to use the t2.large instance type with
100GB hard disk, which consists of 2 CPU cores and 8GB RAM.
5. (Google) For each VM, you are recommended to use the n2-standard-2 instance
type with 100GB hard disk, which consists of 2 CPU cores and 8GB RAM.
6. When following the given references, you may need to modify the commands
according to your own environment, e.g., file location, etc.
7. After installing a single-node Hadoop, you can save the system image and launch
multiple copies of the VM with that image. This can simplify your process of installing
the single-node Hadoop cluster on each VM.
8. Keep an image for your Hadoop cluster. You will need to use the Hadoop cluster
again for subsequent homework assignments.
9. Always refer to the logs when debugging your single-/multi-node Hadoop setup; they
contain more details than the CLI output.
10. Please shut down (do not terminate) your VMs when you are not using them. This
saves credits and prevents your VMs from being attacked while idle.
Submission Requirements:
1. Include all the key steps/ commands, your cluster configuration details, the source code
of your programs, your compilation steps (if any), etc., together with screenshots, in a
SINGLE PDF report. Your report should also include the signed declaration (the first
page of this homework file).
2. Package all the source code included in step 1 into a zip file.
3. You should submit two separate files: your homework report (in PDF format) and a
zip file containing all the code of your homework.
4. Please submit your homework report and code zip file through the Blackboard
system. No email submission is allowed.