- 1. Noga Alon, Shai Ben-David, Nicolò Cesa-Bianchi and David Haussler, "Scale-sensitive Dimensions, Uniform Convergence, and Learnability," Journal of the ACM, 44(4), (1997), 615-631.
- 2. Peter L. Bartlett, "The Sample Complexity of Pattern Classification with Neural Networks: The Size of the Weights is More Important than the Size of the Network," IEEE Trans. Inf. Theory, 44(2), (1998), 525-536.
- 3. Peter Bartlett and John Shawe-Taylor, Generalization Performance of Support Vector Machines and Other Pattern Classifiers, in 'Advances in Kernel Methods - Support Vector Learning', Bernhard Schölkopf, Christopher J. C. Burges, and Alexander J. Smola (eds.), MIT Press, Cambridge, USA, 1998.
- 4. C. Cortes and V. Vapnik, Support-Vector Networks, Machine Learning, 20(3), (1995), 273-297.
- 5. Nello Cristianini, John Shawe-Taylor, and Peter Sykacek, Bayesian Classifiers are Large Margin Hyperplanes in a Hilbert Space, in J. Shavlik (ed.), Machine Learning: Proceedings of the Fifteenth International Conference, Morgan Kaufmann Publishers, San Francisco, CA, 1998.
- 6. R. O. Duda and P. E. Hart, Pattern Classification and Scene Analysis, New York: Wiley, 1973.
- 7. Yoav Freund and Robert E. Schapire, Large Margin Classification Using the Perceptron Algorithm, Proceedings of the Eleventh Annual Conference on Computational Learning Theory, 1998.
- 8. Leonid Gurvits, A Note on a Scale-Sensitive Dimension of Linear Bounded Functionals in Banach Spaces, in Proceedings of Algorithmic Learning Theory, ALT-97, and as NECI Technical Report, 1997.
- 9. Norbert Klasner and Hans Ulrich Simon, From Noise-Free to Noise-Tolerant and from On-line to Batch Learning, Proceedings of the Eighth Annual Conference on Computational Learning Theory, COLT'95, 1995, pp. 250-257.
- 10. B. D. Ripley, Pattern Recognition and Neural Networks, Cambridge: Cambridge University Press, 1996.
- 11. R. Schapire, Y. Freund, P. Bartlett and W. Sun Lee, Boosting the Margin: A New Explanation for the Effectiveness of Voting Methods, in D. H. Fisher, Jr. (ed.), Proceedings of the International Conference on Machine Learning, ICML'97, pp. 322-330, Nashville, Tennessee, July 1997. Morgan Kaufmann Publishers.
- 12. John Shawe-Taylor, Peter L. Bartlett, Robert C. Williamson and Martin Anthony, Structural Risk Minimization over Data-Dependent Hierarchies, IEEE Trans. Inf. Theory, 44(5), (1998), 1926-1940.
- 13. John Shawe-Taylor and Nello Cristianini, Margin Distribution Bounds on Generalization, Proceedings of EuroCOLT'99, Lecture Notes in Artificial Intelligence, 1572, 1999, pp. 263-273.
- 14. John Shawe-Taylor and Robert C. Williamson, Generalization Performance of Classifiers in Terms of Observed Covering Numbers, Proceedings of EuroCOLT'99, Lecture Notes in Artificial Intelligence, 1572, 1999, pp. 274-284.
Index Terms
- Further results on the margin distribution