

Notes on Andrew Ng's DeepLearning.ai Generative Adversarial Networks (GANs) Specialization

2023-02-22 22:28 Author: 聽聽我的腦洞

Generative models

  • Variational Autoencoders: Encoder → Latent Space → Decoder
  • Generative Adversarial Networks
  • Generator: learns to produce realistic examples
  • Discriminator: learns to distinguish between fake and real

1.6 Discriminator P6 - 00:19

Discriminator

Basically, a neural network classifier.

Compare the prediction \hat{Y} with the label Y.

The discriminator models P(Y|X): the probability of the class Y given the features X.
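As a rough sketch of what such a classifier can look like (the architecture and layer sizes here are my own assumptions, not the course's exact model):

```python
import torch
from torch import nn

# Minimal discriminator sketch for flattened 28x28 images (e.g. MNIST).
# Hidden size and slope are illustrative choices.
class Discriminator(nn.Module):
    def __init__(self, im_dim=784, hidden_dim=128):
        super().__init__()
        self.model = nn.Sequential(
            nn.Linear(im_dim, hidden_dim),
            nn.LeakyReLU(0.2),
            nn.Linear(hidden_dim, 1),  # one raw logit: real vs. fake
        )

    def forward(self, image):
        # image: (batch_size, im_dim) flattened pixels
        # apply a sigmoid to the output to read it as P(Y|X)
        return self.model(image)
```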



1.7 Generator P7 - 00:01

Generator

Take noise z, put it into the generator to get \hat{X}; then put \hat{X} into the discriminator to get \hat{Y}_d.

Use \hat{Y}_d, i.e. the difference between the prediction and the "real" label, to update the parameters of the generator.

Save the parameters.





The generator models P(X|Y): the probability of the features X given the class Y.

Since we only care about one specific class at a time, it reduces to

P(X|Y) = P(X) here; we can ignore the Y.
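A matching generator sketch (again with made-up dimensions; z_dim is the noise size):

```python
import torch
from torch import nn

# Minimal generator sketch: noise z -> fake image X_hat in [0, 1].
class Generator(nn.Module):
    def __init__(self, z_dim=64, im_dim=784, hidden_dim=128):
        super().__init__()
        self.model = nn.Sequential(
            nn.Linear(z_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, im_dim),
            nn.Sigmoid(),  # pixel values in [0, 1]
        )

    def forward(self, noise):
        # noise: (batch_size, z_dim), e.g. sampled from N(0, I)
        return self.model(noise)

# noise -> generator -> X_hat -> discriminator -> Y_hat_d:
# z = torch.randn(16, 64)
# y_hat_d = Discriminator()(Generator()(z))
```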

1.8 BCE Cost Function P8 - 00:02

BCE cost function

BCE stands for Binary Cross-Entropy. The loss, broken into parts:

J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log h(x^{(i)}, \theta) + \left(1 - y^{(i)}\right) \log\left(1 - h(x^{(i)}, \theta)\right) \right]

The summation from i = 1 to m averages over the entire batch, where:

h is the prediction,

y is the label,

\theta are the parameters,

x are the features.
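To make the pieces concrete, here is the equation computed by hand on a tiny batch (the numbers are my own illustrative examples):

```python
import torch

# Predictions h(x, theta) and labels y for a batch of m = 3 examples.
h = torch.tensor([0.9, 0.2, 0.7])
y = torch.tensor([1.0, 0.0, 1.0])

# J(theta) = -(1/m) * sum_i [ y_i*log(h_i) + (1 - y_i)*log(1 - h_i) ]
loss = -(y * torch.log(h) + (1 - y) * torch.log(1 - h)).mean()
print(loss)  # tensor(0.2284)
```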



1.8 BCE Cost Function P8 - 01:54


First term: y \log h(x, \theta)

If the true label y is fake, then y = 0, and no matter what the prediction is, the first term is 0.

If the true label y is real (y = 1) and the prediction gives a high probability of real, say 0.99, the first term is close to 0.

However, if the prediction is close to 0, the first term goes to negative infinity.

Hence, negative infinity here indicates a bad result:

if the prediction is good, the term is close to 0;

if the prediction is bad, it goes to -\infty.

Second term: (1 - y) \log(1 - h(x, \theta))

This is the mirror image for fake labels (y = 0): if the prediction is really bad (close to 1), the value goes to negative infinity.

Similarly, negative infinity indicates a bad prediction.
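The extremes can be checked numerically (illustrative values; PyTorch's nn.BCELoss returns the negated average, so "bad" shows up as a large positive loss rather than -\infty):

```python
import torch
from torch import nn

bce = nn.BCELoss()

# Real label (y = 1), confident correct prediction -> loss near 0.
print(bce(torch.tensor([0.99]), torch.tensor([1.0])))      # ~0.01

# Real label (y = 1), prediction near 0 -> loss explodes.
print(bce(torch.tensor([1e-4]), torch.tensor([1.0])))      # ~9.2

# Fake label (y = 0), prediction near 1 -> second term explodes.
print(bce(torch.tensor([1 - 1e-4]), torch.tensor([0.0])))  # ~9.2
```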



1.9 Putting It All Together P9 - 00:15


For the discriminator: pass the fakes \hat{X} and the real X into the discriminator, then compute the BCE loss.

Update \theta_d (the parameters of the discriminator).

The discriminator wants to tell the difference between fake and real;

the generator wants the fake examples to look as real as possible.
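One alternating training step might look like the sketch below (assuming the Generator/Discriminator sketches above; the optimizer settings are illustrative, not the course's exact hyperparameters):

```python
import torch
from torch import nn

criterion = nn.BCEWithLogitsLoss()  # BCE applied to raw logits
gen, disc = Generator(), Discriminator()
gen_opt = torch.optim.Adam(gen.parameters(), lr=1e-4)
disc_opt = torch.optim.Adam(disc.parameters(), lr=1e-4)

def train_step(real, z_dim=64):
    batch = real.size(0)  # real: (batch, 784) flattened images

    # Discriminator step: update theta_d.
    disc_opt.zero_grad()
    fake = gen(torch.randn(batch, z_dim)).detach()  # don't backprop into gen
    disc_loss = (criterion(disc(fake), torch.zeros(batch, 1)) +
                 criterion(disc(real), torch.ones(batch, 1))) / 2
    disc_loss.backward()
    disc_opt.step()

    # Generator step: update theta_g.
    gen_opt.zero_grad()
    # The generator wants its fakes to be classified as real (label 1).
    gen_loss = criterion(disc(gen(torch.randn(batch, z_dim))),
                         torch.ones(batch, 1))
    gen_loss.backward()
    gen_opt.step()
    return disc_loss.item(), gen_loss.item()
```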



2.2 Activations: Basic Properties P12 - 00:07

Activations

An activation function must be non-linear and differentiable.


2.3 Common Activation Functions P13 - 00:25


ReLU: the "dying ReLU" problem. Whenever the input is negative the output is always 0, so information is lost.

Leaky ReLU solves the problem:

\max(az, z)

with e.g. a = 0.1,

so negative inputs are not clamped to 0 but scaled by a small value.

Sigmoid/Tanh: vanishing gradient and saturation problems.
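A quick comparison on negative inputs (a = 0.1 as in the note above):

```python
import torch
from torch import nn

z = torch.tensor([-2.0, -0.5, 0.0, 1.0])
print(nn.ReLU()(z))          # tensor([0., 0., 0., 1.])  -> negatives die
print(nn.LeakyReLU(0.1)(z))  # tensor([-0.2000, -0.0500, 0., 1.]) = max(az, z)
```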



2.4 Batch Normalization Explained P14 - 04:11

Batch normalization reduces the internal covariate shift.

It makes the network easier to train and speeds up the training process.

2.5 Batch Normalization Procedure P15 - 02:43

During training, normalize with the statistics of the current batch.

At test time, normalize with fixed values (the running statistics saved from training).
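In PyTorch this train/test distinction is controlled by the module's mode (a small sketch; the dimensions are arbitrary):

```python
import torch
from torch import nn

bn = nn.BatchNorm1d(num_features=4)
x = torch.randn(8, 4)

bn.train()
y_train = bn(x)  # normalized with this batch's mean/variance;
                 # running statistics are updated as a side effect

bn.eval()
y_test = bn(x)   # normalized with the stored (fixed) running statistics
```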


3.2 Mode Collapse P21 - 00:40




Handwritten digits have 10 modes, one per digit.

With mode collapse, the generator converges to just 1 mode; that's the problem.



3.3 Problem with BCE Loss P22 - 03:06

Vanishing gradients: as the discriminator gets better, the BCE loss flattens out and gives the generator almost no gradient to learn from.
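A tiny illustration of my own, using the minimax form of the generator's BCE loss, log(1 - D(G(z))):

```python
import torch

# When the discriminator is very confident a sample is fake
# (a strongly negative logit), the loss curve is flat there and
# the gradient back to the generator nearly vanishes.
logit = torch.tensor([-10.0], requires_grad=True)
loss = torch.log(1 - torch.sigmoid(logit))
loss.backward()
print(logit.grad)  # ~ -4.5e-05: almost no learning signal
```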


3.4 Earth Mover's Distance P23 - 01:11



3.5 Wasserstein Loss P24 - 00:03





3.6 Condition on Wasserstein Critic P25 - 00:13




3.7 1-Lipschitz Continuity Enforcement P26 - 00:19






4.2 Conditional Generation: Intuition P28 - 02:05



4.3 Conditional Generation: Inputs P29 - 02:02



4.4 Controllable Generation P30 - 00:19

Controllable generation: control some of the features of the generated output.


4.5 Vector Algebra in the Z-Space P31 - 00:49




4.6 Challenges with Controllable Generation P32 - 01:19



4.7 Classifier Gradients P33 - 01:05


Take advantage of a pre-trained classifier: its gradients with respect to the noise vector indicate how to change z to strengthen a desired feature.
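A hedged sketch of the idea (the function name, step size, and loop count are my own; `gen` is a frozen generator and `classifier` scores how strongly the target feature is present):

```python
import torch

def nudge_noise(gen, classifier, z, steps=10, lr=0.1):
    # Gradient ascent on the classifier score with respect to z only;
    # the generator and classifier weights stay frozen.
    z = z.clone().requires_grad_(True)
    for _ in range(steps):
        score = classifier(gen(z)).mean()
        grad = torch.autograd.grad(score, z)[0]
        z = (z + lr * grad).detach().requires_grad_(True)
    return z.detach()
```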

4.8 Disentanglement P34 - 02:24


4.8 Disentanglement P34 - 04:22


