Article title

Future directions in Multiple Instance Learning

Authors
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
In Multiple Instance Learning, each training sample consists of a set of unlabeled instances. The set as a whole is labeled positive if at least one instance in the set is positive, and negative otherwise. Given such training samples, the goal is to learn either an explicit description of the common positive instance(s) or a bag classifier that can assign labels to bags. Previous research has focused on this standard definition of the problem, in which instances in a set are independent. This raises a question: if we remove the independence assumption, can we generalize the goal of finding a description of the common instance(s) to that of finding a description of the common pattern(s) among instances? Similarly, can we generate bag classifiers that discriminate based on common pattern(s) among instances instead of just common instance(s)? This question raises many other related questions that have not yet been fully explored in the context of this problem. In this paper we first present a survey of existing methods that work with the standard definition of the problem and then elaborate on the previous question in the hope that researchers will investigate this exciting research direction.
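The standard MIL assumption described in the abstract can be sketched in a few lines. This is a minimal illustration, not code from the paper: `bag_label` encodes the "positive iff at least one positive instance" rule, and `bag_score` shows a common bag-classifier pattern that mirrors it by aggregating per-instance scores with `max` (both function names are illustrative).

```python
def bag_label(instance_labels):
    """Standard MIL assumption: a bag is positive (1) iff at least one
    of its instances is positive; otherwise it is negative (0)."""
    return int(any(instance_labels))

def bag_score(instance_scores):
    """A simple bag classifier under the standard assumption: aggregate
    instance-level scores with max, mirroring the 'at least one'
    semantics of the bag label."""
    return max(instance_scores)

if __name__ == "__main__":
    print(bag_label([0, 0, 1]))        # one positive instance -> positive bag
    print(bag_label([0, 0, 0]))        # no positive instances -> negative bag
    print(bag_score([0.1, 0.9, 0.3]))  # bag score driven by its strongest instance
```

Note that `max` aggregation treats instances as independent; the future direction the paper raises is precisely to replace this with classifiers that respond to patterns *among* instances.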
Contributors
author
  • School of Computer Science, University of Guelph, Guelph, Canada
author
  • St. Francis Xavier University, Antigonish, Canada
author
  • School of Computer Science, University of Guelph, Guelph, Canada
Bibliography
  • [1] Dietterich, T. G., Lathrop, R. H., Lozano-Pérez, T.: Solving the multiple instance problem with axis-parallel rectangles. Artificial Intelligence, 89(1-2), pp. 31–71, 1997. ISSN 0004-3702.
  • [2] Teramoto, R., Kashima, H.: Prediction of protein-ligand binding affinities using multiple instance learning. Journal of Molecular Graphics and Modelling, 29(3), pp. 492–497, 2010.
  • [3] Maron, O., Lozano-Pérez, T.: A framework for multiple-instance learning. In: Proceedings of the Advances in Neural Information Processing Systems, pp. 570–576. 1998.
  • [4] Zhang, Q., Goldman, S. A.: EM-DD: An Improved Multiple-Instance Learning Technique. In: Advances in Neural Information Processing Systems, volume 14, pp. 1073–1080. MIT Press, 2001.
  • [5] Wang, C., Zhang, L., Zhang, H.-J.: Graph-based multiple-instance learning for object-based image retrieval. In: Proceedings of the 1st ACM international conference on Multimedia information retrieval, MIR ’08, pp. 156–163. ACM, New York, NY, USA, 2008. ISBN 978-1-60558-312-9.
  • [6] Li, F., Liu, R.: Multi-graph multi-instance learning for object-based image and video retrieval. In: Proceedings of the 2nd ACM International Conference on Multimedia Retrieval, ICMR ’12, pp. 35:1–35:8. ACM, New York, NY, USA, 2012. ISBN 978-1-4503-1329-2.
  • [7] Rahmani, R., Goldman, S. A., Zhang, H., Krettek, J., Fritts, J. E.: Localized content based image retrieval. In: Proceedings of the 7th ACM SIGMM international workshop on Multimedia information retrieval, MIR ’05, pp. 227–236. ACM, New York, NY, USA, 2005. ISBN 1-59593-244-5.
  • [8] Gondra, I., Xu, T.: A multiple instance learning based framework for semantic image segmentation. Multimedia Tools Appl., 48(2), pp. 339–365, 2010. ISSN 1380-7501.
  • [9] Qi, Z., Xu, Y., Wang, L., Song, Y.: Online multiple instance boosting for object detection. Neurocomput., 74(10), pp. 1769–1775, 2011. ISSN 0925-2312.
  • [10] Dollár, P., Babenko, B., Belongie, S., Perona, P., Tu, Z.: Multiple Component Learning for Object Detection. In: Proceedings of the 10th European Conference on Computer Vision: Part II, ECCV ’08, pp. 211–224. Springer-Verlag, Berlin, Heidelberg, 2008. ISBN 978-3-540-88685-3.
  • [11] Viola, P., Platt, J. C., Zhang, C.: Multiple Instance Boosting for Object Detection. In: Advances in Neural Information Processing Systems 18, pp. 1417–1426. 2007.
  • [12] Babenko, B., Yang, M.-H., Belongie, S.: Robust Object Tracking with Online Multiple Instance Learning. IEEE Trans. Pattern Anal. Mach. Intell., 33(8), pp. 1619–1632, 2011. ISSN 0162-8828.
  • [13] Zhou, Z.-H., Jiang, K., Li, M.: Multi-Instance Learning Based Web Mining. Applied Intelligence, 22(2), pp. 135–147, 2005. ISSN 0924-669X.
  • [14] Andrews, S., Tsochantaridis, I., Hofmann, T.: Support vector machines for multiple-instance learning. In: Advances in Neural Information Processing Systems, volume 15, pp. 561–568. 2003.
  • [15] Zhou, X., Ruan, J., Zhang, W.: Promoter prediction based on a multiple instance learning scheme. In: Proceedings of the First ACM International Conference on Bioinformatics and Computational Biology, BCB ’10, pp. 295–301. ACM, New York, NY, USA, 2010. ISBN 978-1-4503-0438-2.
  • [16] Mei, S., Fei, W.: Structural Domain Based Multiple Instance Learning for Predicting Gram-Positive Bacterial Protein Subcellular Localization. In: IJCBS’09, pp. 195–200. 2009.
  • [17] Long, P. M., Tan, L.: PAC learning axis-aligned rectangles with respect to product distributions from multiple-instance examples. In: Proceedings of the 9th Annual Conference on Computational learning Theory, pp. 228–234. 1996. ISBN 0-89791-811-8.
  • [18] Blum, A., Kalai, A.: A Note on Learning from Multiple-Instance Examples. Machine Learning, 30(1), pp. 23–29, 1998. ISSN 0885-6125.
  • [19] Auer, P.: On learning from multi-instance examples: empirical evaluation of a theoretical approach. In: Proceedings of the 4th International Conference on Machine Learning, pp. 21–29. 1997. ISBN 1-55860-486-3.
  • [20] Xu, T., Chiu, D., Gondra, I.: Constructing target concept in multiple instance learning using maximum partial entropy. In: Proceedings of the 8th international conference on Machine Learning and Data Mining in Pattern Recognition, MLDM’12, pp. 169–182. Springer-Verlag, Berlin, Heidelberg, 2012. ISBN 978-3-642-31536-7.
  • [21] Raykar, V. C., Krishnapuram, B., Bi, J., Dundar, M., Rao, R. B.: Bayesian multiple instance learning: automatic feature selection and inductive transfer. In: Proceedings of the 25th international conference on Machine learning, ICML ’08, pp. 808–815. ACM, New York, NY, USA, 2008. ISBN 978-1-60558-205-4.
  • [22] Ray, S., Craven, M.: Supervised versus multiple instance learning: an empirical comparison. In: Proceedings of the 22nd international conference on Machine learning, ICML ’05, pp. 697–704. ACM, New York, NY, USA, 2005. ISBN 1-59593-180-5.
  • [23] Xu, X., Frank, E.: Logistic Regression and Boosting for Labeled Bags of Instances. In: Proc. of the Pacific-Asia Conf. on Knowledge Discovery and Data Mining, pp. 272–281. Springer-Verlag, 2004.
  • [24] Wang, J., Zucker, J. D.: Solving the Multiple-Instance Problem: A Lazy Learning Approach. In: Proceedings of the 17th International Conference on Machine Learning, pp. 1119–1126. 2000. ISBN 1-55860-707-2.
  • [25] Boser, B. E., Guyon, I. M., Vapnik, V. N.: A training algorithm for optimal margin classifiers. In: Proceedings of the fifth annual workshop on Computational learning theory, COLT ’92, pp. 144–152. ACM, New York, NY, USA, 1992. ISBN 0-89791-497-X.
  • [26] Gärtner, T., Flach, P. A., Kowalczyk, A., Smola, A. J.: Multi-Instance Kernels. In: Proc. 19th International Conf. on Machine Learning, pp. 179–186. Morgan Kaufmann, 2002.
  • [27] Chevaleyre, Y., Zucker, J. D.: Solving Multiple-Instance and Multiple-Part Learning Problems with Decision Trees and Rule Sets. Application to the Mutagenesis Problem. In: Proceedings of the 14th Biennial Conference of the Canadian Society on Computational Studies of Intelligence. pp. 204–214. 2001.
  • [28] Ruffo, G.: Learning Single and Multiple Instance Decision Trees for Computer Security Applications. Ph.D. thesis, Università di Torino, Italy, 2001.
  • [29] Leistner, C., Saffari, A., Bischof, H.: MIForests: multiple-instance learning with randomized trees. In: Proceedings of the 11th European conference on Computer vision: Part VI, ECCV’10, pp. 29–42. Springer-Verlag, Berlin, Heidelberg, 2010. ISBN 3-642-15566-9, 978-3-642-15566-6.
  • [30] Zhang, M. L., Zhou, Z. H.: Adapting RBF Neural Networks to Multi-Instance Learning. Neural Process. Lett., 23(1), pp. 1–26, 2006.
  • [31] Schapire, R. E.: The Strength of Weak Learnability. Mach. Learn., 5(2), pp. 197–227, 1990. ISSN 0885-6125.
  • [32] Mason, L., Baxter, J., Bartlett, P., Frean, M.: Boosting Algorithms as Gradient Descent. In: Advances in Neural Information Processing Systems 12, pp. 512–518. MIT Press, 2000.
  • [33] Friedman, J., Hastie, T., Tibshirani, R.: Additive Logistic Regression: a Statistical View of Boosting. The Annals of Statistics, 38(2), 2000.
  • [34] Hajimirsadeghi, H., Mori, G.: Multiple Instance Real Boosting with Aggregation Functions. In: International Conference on Pattern Recognition, ICPR. 2012.
  • [35] Zhang, M.-L., Zhou, Z.-H.: M3MIML: A Maximum Margin Method for Multi-instance Multi-label Learning. In: Proceedings of the 2008 Eighth IEEE International Conference on Data Mining, ICDM ’08, pp. 688–697. IEEE Computer Society, Washington, DC, USA, 2008. ISBN 978-0-7695-3502-9.
  • [36] Xu, X.-S., Xue, X., Zhou, Z.-H.: Ensemble multi-instance multi-label learning approach for video annotation task. In: Proceedings of the 19th ACM international conference on Multimedia, MM ’11, pp. 1153–1156. ACM, New York, NY, USA, 2011. ISBN 978-1-4503-0616-4.
  • [37] Li, Y.-X., Ji, S., Kumar, S., Ye, J., Zhou, Z.-H.: Drosophila Gene Expression Pattern Annotation through Multi-Instance Multi-Label Learning. IEEE/ACM Trans. Comput. Biol. Bioinformatics, 9(1), pp. 98–112, 2012. ISSN 1545-5963.
  • [38] Yakhnenko, O., Honavar, V.: Multi-Instance Multi-Label Learning for Image Classification with Large Vocabularies. In: Proceedings of the British Machine Vision Conference 2011, pp. 59.1–59.12. British Machine Vision Association, 2011. ISBN 1-901725-43-X.
  • [39] Zhou, Z.-H., Zhang, M.-L.: Multi-Instance Multi-Label Learning with Application to Scene Classification. In: NIPS’06, pp. 1609–1616. 2006.
Document type
YADDA identifier
bwmeta1.element.baztech-264e0af1-9656-40ec-b4d7-969a625c5edc