
Conversation

@dreadlord1984

No description provided.

… list; Add dimensions to graph output; Add detailed table of network layers; Add summarized table
fix summarization bug (detailed table modified)
recognizes data>transform_param>crop_size as input-size param
moved calculated parameters/dimensions to node.analysis instead of node.dim
added LRN, INNERPRODUCT, BATCHNORM, FLATTEN layers
Convolution parameters separate for W, H (stride, size, ...)
Comp.Complexity / Memory Req. still missing!
highlight color for LRN nodes
update javascript (fully-connected output dimensions)
add computational complexity + memory requirements
add TOTAL row in summary table
pretty-print large numbers
… gh-pages

# Conflicts:
#	quickstart.html
add dynamic page title
join batchnorm/relu to inline in inceptionv3
Add FCN-16s prototxt (fixed up for net. Need name=top in names!)
fix FCN-16s net (false implicit layers removed)
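
The commits above add per-layer dimension tracking, with separate W/H convolution parameters and fully-connected output sizes. As a rough illustration of the math involved, here is a minimal sketch; it is not netscope's actual code, and the helper names (`convOutputDims`, `innerProductOutputDims`) are made up, but the formulas follow Caffe's kernel/stride/pad conventions.

```typescript
// Illustrative sketch of per-layer output-dimension math (not netscope's actual code).
// Parameter names mirror Caffe's conventions: kernel_w/h, stride_w/h, pad_w/h, num_output.

interface ConvParams {
  numOutput: number;
  kernelW: number; kernelH: number;
  strideW: number; strideH: number;
  padW: number; padH: number;
}

interface Dims { channels: number; height: number; width: number; }

// Convolution: W and H are handled separately, so asymmetric kernels/strides work.
function convOutputDims(input: Dims, p: ConvParams): Dims {
  const width = Math.floor((input.width + 2 * p.padW - p.kernelW) / p.strideW) + 1;
  const height = Math.floor((input.height + 2 * p.padH - p.kernelH) / p.strideH) + 1;
  return { channels: p.numOutput, height, width };
}

// InnerProduct (fully-connected): output collapses to a vector of length num_output.
function innerProductOutputDims(numOutput: number): Dims {
  return { channels: numOutput, height: 1, width: 1 };
}

// Example: 3x224x224 input through a 7x7/2 convolution with 3-pixel padding.
console.log(convOutputDims(
  { channels: 3, height: 224, width: 224 },
  { numOutput: 64, kernelW: 7, kernelH: 7, strideW: 2, strideH: 2, padW: 3, padH: 3 },
)); // -> { channels: 64, height: 112, width: 112 }
```
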
@dreadlord1984
Author

update

@rlnx
Contributor

rlnx commented Jan 14, 2018

Hi @dgschwend, could you please provide any description for your changes?

@dgschwend

Hi @RuslanIsrafilov! My fork contains code for CNN network analysis (computational complexity, memory requirements for each layer). However, I did not open this pull request; I think there is no need to pull any changes back to the original netscope repo. @dreadlord1984, what's your intention with this PR?
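
For context on "computational complexity, memory requirements for each layer": this kind of analysis reduces to a few closed-form counts per layer type. The sketch below is illustrative only and not taken from the fork; `convCosts` and `fcCosts` are hypothetical helper names.

```typescript
// Illustrative per-layer cost model (not the fork's actual implementation).
// MACs = multiply-accumulate operations; memory figures are element counts
// (multiply by 4 for float32 bytes).

// Convolution producing a C_out x H_out x W_out output from C_in input channels:
//   MACs        = C_out * H_out * W_out * C_in * K_h * K_w
//   weights     = C_out * C_in * K_h * K_w (+ C_out biases)
//   activations = C_out * H_out * W_out
function convCosts(cIn: number, cOut: number, hOut: number, wOut: number,
                   kH: number, kW: number) {
  const macs = cOut * hOut * wOut * cIn * kH * kW;
  const weights = cOut * cIn * kH * kW + cOut;
  const activations = cOut * hOut * wOut;
  return { macs, weights, activations };
}

// Fully-connected layer mapping nIn inputs to nOut outputs:
//   MACs = nIn * nOut, weights = nIn * nOut + nOut, activations = nOut
function fcCosts(nIn: number, nOut: number) {
  return { macs: nIn * nOut, weights: nIn * nOut + nOut, activations: nOut };
}

// Example: a 3 -> 64 channel, 7x7 convolution with a 112x112 output,
// and a 4096 -> 1000 classifier layer.
console.log(convCosts(3, 64, 112, 112, 7, 7)); // ~118M MACs
console.log(fcCosts(4096, 1000));              // ~4.1M MACs
```
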

sovrasov and others added 4 commits January 19, 2018 15:12
* Add prelu and elu support

* Recompile coffee files
* Add Input data layer

* Recompile coffee scripts
* add support for relu6

* fix relu6
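
For reference, the activations these commits add support for (PReLU, ELU, ReLU6) are simple element-wise functions that leave layer dimensions unchanged; a minimal illustrative sketch:

```typescript
// Element-wise activations referenced by the commits above (illustrative sketch).
const relu6 = (x: number): number => Math.min(Math.max(0, x), 6);
const elu = (x: number, alpha = 1.0): number => (x >= 0 ? x : alpha * (Math.exp(x) - 1));
// PReLU applies a learned slope to negative inputs (per channel in practice).
const prelu = (x: number, slope: number): number => (x >= 0 ? x : slope * x);

console.log(relu6(8));        // 6
console.log(elu(-1));         // ~-0.632
console.log(prelu(-2, 0.25)); // -0.5
```
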