DDA3020 Homework 1
Due date: Oct 14, 2024
Instructions
• The deadline is 23:59, Oct 14, 2024.
• The weight of this assignment in the final grade is 20%.
• Electronic submission: Turn in solutions electronically via Blackboard. Be sure to submit
 your homework as one PDF file plus two Python scripts. Please name your solution files
”DDA3020HW1 studentID name.pdf”, ”HW1 yourID Q1.ipynb” and ”HW1 yourID Q2.ipynb”.
(.py files are also acceptable)
• Note that late submissions will result in discounted scores: 0-24 hours → 80%, 24-120 hours
→ 50%, 120 or more hours → 0%.
• Answer the questions in English. Otherwise, you’ll lose half of the points.
• Collaboration policy: You need to solve all questions independently and collaboration between
students is NOT allowed.
1 Written Problems (50 points)
1.1. (Learning of Linear Regression, 25 points) Suppose we have training data:
{(x_1, y_1), (x_2, y_2), . . . , (x_N, y_N)},
where x_i ∈ R^d and y_i ∈ R^k, i = 1, 2, . . . , N.
i) (9 pts) Find the closed-form solution of the following problem:

    min_{W,b} ∑_{i=1}^{N} ‖y_i − W x_i − b‖_2^2.
ii) (8 pts) Show how to use gradient descent to solve the problem. (Please state at least one
possible Stopping Criterion)
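For part ii), the gradient update and one gradient-norm stopping criterion can be sketched numerically; the synthetic data, step size, and tolerance below are illustrative choices, not part of the assignment.

```python
import numpy as np

# Synthetic data: N = 100 samples, d = 3 features, k = 1 target.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
Y = X @ np.array([[1.0], [2.0], [-1.0]]) + 0.5   # planted W and bias b = 0.5

W = np.zeros((1, 3))          # k x d weight matrix
b = np.zeros((1, 1))          # k x 1 bias
lr, tol = 1e-3, 1e-8

for step in range(20000):
    R = Y - X @ W.T - b.T                         # N x k residuals
    grad_W = -2 * R.T @ X                         # gradient of sum ||y_i - W x_i - b||^2
    grad_b = -2 * R.sum(axis=0, keepdims=True).T
    # Stopping criterion: overall gradient norm below a tolerance.
    if np.sqrt((grad_W ** 2).sum() + (grad_b ** 2).sum()) < tol:
        break
    W -= lr * grad_W
    b -= lr * grad_b

print("W =", W, "b =", b, "stopped at step", step)
```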
DDA3020 Machine Learning Autumn 2024, CUHKSZ
iii) (8 pts) We further suppose that x_1, x_2, . . . , x_N are drawn from N(µ, σ^2). Show that the
maximum likelihood estimation (MLE) of σ^2 is

    σ̂^2_MLE = (1/N) ∑_{n=1}^{N} (x_n − µ_MLE)^2.
1.2. (Support Vector Machine, 25 points) Given two positive samples x_1 = (3, 3)^T,
x_2 = (4, 3)^T, and one negative sample x_3 = (1, 1)^T, find the maximum-margin separating
hyperplane and support vectors.
Solution steps:
i) Formulating the Optimization Problem (5 pts)
ii) Constructing the Lagrangian (5 pts)
iii) Using KKT Conditions (5 pts)
iv) Solving the Equations (5 pts)
v) Determining the Hyperplane Equation and Support Vectors (5 pts)
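As a numerical cross-check of the hand derivation in steps i)–v), the sketch below fits a linear SVM to the three points with sklearn, using a large C to stand in for the hard margin; it prints the hyperplane parameters and support-vector indices so you can compare them with your analytical result.

```python
import numpy as np
from sklearn.svm import SVC

X = np.array([[3, 3], [4, 3], [1, 1]])   # x1, x2 positive; x3 negative
y = np.array([1, 1, -1])

clf = SVC(kernel="linear", C=1e5).fit(X, y)   # large C approximates a hard margin
w, b = clf.coef_[0], clf.intercept_[0]
print("w =", w, "b =", b, "support vector indices:", clf.support_.tolist())
```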
2 Programming (50 points)
2.1. (Linear regression, 25 points) We have a labeled dataset D = {(x_1, y_1), (x_2, y_2),
. . . , (x_n, y_n)}, with x_i ∈ R^d being the d-dimensional feature vector of the i-th sample, and
y_i ∈ R being the real-valued target (label).
A linear regression model is given by

    f_{w_0,...,w_d}(x) = w_0 + w_1 x_1 + w_2 x_2 + · · · + w_d x_d,    (1)

where w_0 is often called the bias and w_1, w_2, . . . , w_d are often called the coefficients.
Now, we want to utilize the dataset D to build a linear model based on linear regression.
We provide a training set Dtrain that includes 2024 labeled samples with 11 features (see linear
regression train.txt) to fit the model, and a test set Dtest that includes 10 unlabeled samples with
11 features (see linear regression test.txt) to evaluate the model.
1. Use the LinearRegression class from the sklearn package to get the bias w_0 and the coefficients
w_1, w_2, . . . , w_11, then compute ŷ = f(x) on the test set Dtest using the trained model. (Put
the estimates of w_0, w_1, . . . , w_11 and these ŷ in your answers.)
2. Implement linear regression by yourself to obtain the bias w_0 and the coefficients
w_1, w_2, . . . , w_11, then compute ŷ = f(x) on the test set Dtest. (Put the estimates of
w_0, w_1, . . . , w_11 and these ŷ in your answers. You are allowed to compute the inverse of a
matrix using an existing Python package.)
(Hint: Note that linear regression train.txt has 2024 rows with 12 columns, where the
first 11 columns are the features x and the last column is the target y; linear regression test.txt
contains only 10 rows with 11 columns (features). Both tasks require the submission of
code and results. Put all the code in a ”HW1 yourID Q1.ipynb” Jupyter notebook (a ”.py”
file is also acceptable).)
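A minimal sketch of both subtasks; synthetic stand-in data replaces linear regression train.txt here (file loading, e.g. with np.loadtxt, is left to you), and the normal equation w = (AᵀA)⁻¹Aᵀy plays the role of the closed-form solution in task 2:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic stand-in for "linear regression train.txt": 2024 rows, 11 features + 1 target.
rng = np.random.default_rng(0)
X = rng.normal(size=(2024, 11))
true_w = np.arange(1, 12, dtype=float)
y = 3.0 + X @ true_w                     # planted bias w0 = 3

# Task 1: sklearn.
model = LinearRegression().fit(X, y)
print("w0 =", model.intercept_, "w1..w11 =", model.coef_)

# Task 2: closed form via the normal equation, w = (A^T A)^{-1} A^T y,
# where A = [1, X] prepends a column of ones for the bias.
A = np.hstack([np.ones((X.shape[0], 1)), X])
w = np.linalg.inv(A.T @ A) @ A.T @ y
print("w0 =", w[0], "w1..w11 =", w[1:])

# For a 10 x 11 test matrix X_test, predictions would be: y_hat = w[0] + X_test @ w[1:]
```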
2.2. (SVM, 25 points)
Task Description You are asked to write a program that constructs support vector machine
models with different kernel functions and slack variables.
Datasets You are provided with the iris dataset. The data set contains 3 classes of 50 instances
each, where each class refers to a type of iris plant. There are four features: 1. sepal length in cm;
2. sepal width in cm; 3. petal length in cm; 4. petal width in cm. You need to use these features
to classify each iris plant as one of the three possible types.
What you should do You should use the SVM functions from the Python sklearn package, which
provides various forms of SVM. For multiclass SVM you should use the one-vs-rest
strategy. You are recommended to use the sklearn.svm.SVC() class. You can use numpy for vector
manipulation. In your technical report, you should report the results required as mentioned below (e.g.
training error, testing error, and so on).
1. (2 points) Split training set and test set. Split the data into a training set and a test set.
The training set should contain 70% of the samples, while the test set should include 30%.
The number of samples from each category in both the training and test sets should reflect
this 70-30 split; for each category, the first 70% of the samples will form the training set, and
the remaining 30% will form the test set. Ensure that the split maintains the original order
of the data. You should report the instance ids in the split training set and test set. The output
format is as follows:
Q2.2.1 Split training set and test set:
Training set: xx
Test set: xx
You should fill up xx in the template. You should write the ids for each set on the same line,
comma separated, e.g. Training set: [1, 4, 19].
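One way to realize the ordered per-class 70/30 split, assuming the data is loaded through sklearn.datasets.load_iris (variable names are illustrative):

```python
import numpy as np
from sklearn.datasets import load_iris

iris = load_iris()
y = iris.target                          # 150 labels, 50 per class, in original order

train_ids, test_ids = [], []
for c in np.unique(y):
    ids = np.where(y == c)[0]            # indices of this class, original order kept
    cut = int(len(ids) * 0.7)            # first 70% -> train, last 30% -> test
    train_ids.extend(ids[:cut].tolist())
    test_ids.extend(ids[cut:].tolist())

print("Training set:", train_ids)
print("Test set:", test_ids)
```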
2. (10 points) Calculation using Standard SVM Model (Linear Kernel). Employ the
standard SVM model with a linear kernel. Train your SVM on the split training dataset and
validate it on the testing dataset. Calculate the classification error for both the training and
testing datasets, and output the weight vector w, the bias b, and the indices of the support vectors
(starting from 0). Note that the scikit-learn package does not offer a function with a hard margin,
so we will simulate this using C = 1e5. You should first print out the total training error
and testing error, where the error is (number of wrong predictions) / (number of data points).
Then, print out the results for each class
separately (note that you should calculate errors for each class separately in this part). You
should also mention in your report which classes are linearly separable with SVM without slack.
The output format is as follows:
Q2.2.2 Calculation using Standard SVM Model:
total training error: xx, total testing error: xx,
class setosa:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
class versicolor:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
class virginica:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
Linear separable classes: xx
If we view the one-vs-rest strategy as combining multiple different SVMs, each one being
a separating hyperplane between one class and the rest of the points, then the w, b, and support
vector indices for that class are the corresponding parameters of the SVM separating this class
from the rest of the points. If a variable is of vector form, say a = (1, 2, 3)^T, then you should write
each entry on the same line, comma separated, e.g. [1, 2, 3].
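One possible shape for this part: since the per-class w, b, and support-vector indices come from one-vs-rest binary problems, each class can be trained as its own binary SVC. The sketch below re-derives the 35-per-class training split from part 1 and uses C = 1e5 to simulate the hard margin as noted above; it prints the training side only, and the testing side follows the same pattern.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
# Ordered 70/30 per-class split as in part 1 (first 35 of each class train).
train = np.concatenate([np.where(y == c)[0][:35] for c in np.unique(y)])
X_train, y_train = X[train], y[train]

for c, name in enumerate(load_iris().target_names):
    y_bin = np.where(y_train == c, 1, -1)        # this class vs. the rest
    clf = SVC(kernel="linear", C=1e5).fit(X_train, y_bin)
    train_err = np.mean(clf.predict(X_train) != y_bin)
    print(f"class {name}: training error: {train_err},",
          f"w: {clf.coef_[0]}, b: {clf.intercept_[0]},",
          f"support vector indices: {clf.support_.tolist()}")
```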
3. (6 points) Calculation using SVM with Slack Variables (Linear Kernel). For each
C = 0.25 × t, where t = 1, 2, . . . , 4, train your SVM on the training dataset, and subsequently
validate it on the testing dataset. Calculate the classification error for both the training and
testing datasets, the weight vector w, the bias b, the indices of the support vectors, and the
slack variable ζ of the support vectors (you may compute it as max(0, 1 − y · f(x))). The output
format is as follows:
Q2.2.3 Calculation using SVM with Slack Variables (C = 0.25 × t, where t = 1, . . . , 4):
-------------------------------------------
C=0.25,
total training error: xx, total testing error: xx,
class setosa:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
slack variable: xx,
class versicolor:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
slack variable: xx,
class virginica:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
slack variable: xx,
-------------------------------------------
C=0.5,
<... results for (C=0.5) ...>
-------------------------------------------
C=0.75,
<... results for (C=0.75) ...>
-------------------------------------------
C=1,
<... results for (C=1) ...>
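The slack values can be computed exactly as hinted, ζ = max(0, 1 − y · f(x)), using decision_function for f(x); a sketch for a single C and a single binary subproblem (versicolor vs. the rest; the data setup mirrors part 1, and the variable names are illustrative):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
train = np.concatenate([np.where(y == c)[0][:35] for c in np.unique(y)])
X_train = X[train]
y_bin = np.where(y[train] == 1, 1, -1)           # versicolor vs. the rest

C = 0.25
clf = SVC(kernel="linear", C=C).fit(X_train, y_bin)
f = clf.decision_function(X_train)               # f(x) = w . x + b
slack = np.maximum(0, 1 - y_bin[clf.support_] * f[clf.support_])
print("support vector indices:", clf.support_.tolist())
print("slack variable:", np.round(slack, 4).tolist())
```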
4. (7 points) Calculation using SVM with Kernel Functions. Conduct experiments with
different kernel functions for SVM without slack variables. Calculate the classification error
for both the training and testing datasets, and the indices of the support vectors, for each kernel
type:
(a) 2nd-order Polynomial Kernel
(b) 3rd-order Polynomial Kernel
(c) Radial Basis Function Kernel with σ = 1
(d) Sigmoidal Kernel with σ = 1
The output format is as follows:
Q2.2.4 Calculation using SVM with Kernel Functions:
-------------------------------------------
(a) 2nd-order Polynomial Kernel,
total training error: xx, total testing error: xx,
class setosa:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
class versicolor:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
class virginica:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
-------------------------------------------
(b) 3rd-order Polynomial Kernel,
<... results for (b) ...>
-------------------------------------------
(c) Radial Basis Function Kernel with σ = 1,
<... results for (c) ...>
-------------------------------------------
(d) Sigmoidal Kernel with σ = 1,
<... results for (d) ...>
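Note that sklearn parameterizes the RBF and sigmoid kernels via gamma rather than σ; under the common convention K(x, x′) = exp(−‖x − x′‖² / (2σ²)), σ = 1 maps to gamma = 1/(2σ²) = 0.5 (check this against the convention used in lectures; the mapping here is an assumption). A sketch of the four kernels on one binary subproblem, with the training split from part 1:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
train = np.concatenate([np.where(y == c)[0][:35] for c in np.unique(y)])
X_train, y_train = X[train], y[train]

sigma = 1.0
gamma = 1 / (2 * sigma ** 2)             # sigma = 1  ->  gamma = 0.5 (assumed convention)
kernels = {
    "(a) 2nd-order Polynomial": dict(kernel="poly", degree=2),
    "(b) 3rd-order Polynomial": dict(kernel="poly", degree=3),
    "(c) RBF (sigma = 1)": dict(kernel="rbf", gamma=gamma),
    "(d) Sigmoid (sigma = 1)": dict(kernel="sigmoid", gamma=gamma),
}
for name, params in kernels.items():
    y_bin = np.where(y_train == 0, 1, -1)            # setosa vs. rest, as one example
    clf = SVC(C=1e5, **params).fit(X_train, y_bin)   # large C ~ no slack
    err = np.mean(clf.predict(X_train) != y_bin)
    print(name, "training error:", err,
          "support vector indices:", clf.support_.tolist())
```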
Submission Submit your executable code in a ”HW1 yourID Q2.ipynb” Jupyter notebook (a ”.py”
file is also acceptable). Indicate the corresponding question number in the comment for each cell,
and ensure that your code can logically produce the required results for each question in the required
format. Please note that you need to write clear comments and use appropriate function/variable
names. Excessively unreadable code may result in point deductions.
