WEBVTT 1 00:00:03.600 --> 00:00:04.330 Kari Weaver: Okay. 2 00:00:07.770 --> 00:00:30.199 Kari Weaver: I'll share that. Well, I don't think anybody else came in, but we'll look at that shared document. Before we get into that, though, I want to introduce the team members that have been part of this year-long AAC&U Institute. AAC&U stands for the American Association of Colleges and Universities, and this is their Institute on AI, Curriculum, and Pedagogy. 3 00:00:30.200 --> 00:00:39.760 Kari Weaver: So I'm Kari Weaver. I teach in Liberal Arts, and I'm the director of the Jane B. Nord Center for Teaching and Learning. Susie, can you go next? 4 00:00:42.220 --> 00:00:47.119 Suzanne McGinness (she/her): Hi! I'm Susie McGinness. I'm an assistant professor with the Illustration Department. 5 00:00:51.390 --> 00:01:00.400 Jason Tilk: I'll pipe up. Jason Tilk, professor of practice in Industrial Design as well as Liberal Arts. 6 00:01:02.580 --> 00:01:06.710 sligon: And Scott Ligon, associate professor in Foundation. 7 00:01:07.840 --> 00:01:12.050 Jimmy Kuehnle: Jimmy Kuehnle, professor in Sculpture + Expanded Media and Foundation. 8 00:01:13.940 --> 00:01:30.150 Kari Weaver: Great, thank you. So, the Institute itself: there are a lot of events that go along with it. We were able to be part of this institute through some additional grant funding I got from the Nord family. 9 00:01:30.330 --> 00:01:31.380 Kari Weaver: So that's 10 00:01:31.490 --> 00:01:43.809 Kari Weaver: where that funding has come from, at least for the year. We've had a lot of sessions online, a two-day kickoff, and sessions of choice 11 00:01:43.840 --> 00:02:09.110 Kari Weaver: related, you know, to ethics, teaching and learning, etc., and a great kickoff session with a futurist talking about AI. We just had a midyear event that included a student panel, and we'll share a little bit from that student panel. We have a mentor as part of this, and we've been able to have conversations with people at a lot of different institutions. 12 00:02:09.110 --> 00:02:31.139 Kari Weaver: But we're the only art and design institution at the Institute. There is another college or department of art there, but that has been representative of our experience, where not a lot of people are talking about art and design. So that's been an interesting aspect of this. 13 00:02:31.650 --> 00:02:38.210 Kari Weaver: To give you a little more information about this, I'm going to share my screen. In case 14 00:02:38.620 --> 00:02:50.370 Kari Weaver: you don't have it open, and maybe you're already looking at yours. So I wanted to put the Institute purpose and design in this document. 15 00:02:50.380 --> 00:03:13.730 Kari Weaver: And this is the language directly from the website, if you're interested. In here they're talking a lot about a skills gap in terms of employment. We haven't been talking so much in the Institute about that skills gap, but it has been an often-repeated concern. 16 00:03:14.204 --> 00:03:37.689 Kari Weaver: The goals down here, though, are much more of what this is focused on and what our work in the committee is focused on: rethink pedagogical and assessment approaches within and across courses; address academic integrity concerns and consider new policies and practices; contemplate the ethical and equity implications of AI; 17 00:03:37.840 --> 00:03:44.920 Kari Weaver: and adopt AI competencies and literacies as course and/or programmatic learning outcomes.
18 00:03:45.420 --> 00:03:55.730 Kari Weaver: And what is clear from the the leading paragraph before those bullets and the conversations I've been in 19 00:03:55.910 --> 00:04:07.379 Kari Weaver: is because AI is moving so fast. Right? We we it just is is rapidly changing. And it's changing, you know, teaching, learning, working. 20 00:04:07.630 --> 00:04:25.280 Kari Weaver: And so we have 2 main entities who are really teaching people about AI use about socializing them to to AI. And that's industry and corporations and educational institutions. Right? Very different end goals for those 2 groups. 21 00:04:25.630 --> 00:04:42.099 Kari Weaver: Right? The end goals of industry and corporations is maximizing profit, shareholder wealth, efficiency, and innovation. And certainly we share that last goal, especially of of wanting to have, you know. 22 00:04:42.510 --> 00:04:44.619 Kari Weaver: promote innovation. But 23 00:04:44.720 --> 00:05:04.890 Kari Weaver: the goals of higher Ed are very different. Right? We want to generate knowledge. We want to serve a public good, promote civic outcomes, and also, you know, enhance our students individual growth and agency. And when I think about which goals, I feel, are more important 24 00:05:04.890 --> 00:05:23.350 Kari Weaver: in shaping how we teach AI right, shaping the norms in relationship to it. I'd much rather it align with the goals of higher education, which is why I feel it's so important that we invest a lot into thinking about how we're doing this. And we we make moves on that. 25 00:05:23.720 --> 00:05:43.689 Kari Weaver: and that is a good transition to our 1st question. I did put our panel questions in this document. I'm going to stop my share, though, but for my panelists. I want to know why you were interested in the Institute, and I'll start off with Jimmy. 26 00:05:48.970 --> 00:06:02.450 Jimmy Kuehnle: On mute. I was interested in the Institute because we can always learn more things from other folks, and, as Kari said, AI is moving so fast, and it seems like a great thing to be involved in, and was super interested. 27 00:06:09.160 --> 00:06:35.300 Suzanne McGinness (she/her): I can go next. Coming from illustration. I'm very interested in the ethical uses of AI, and I'm very new to AI as well. So I'm interested in how other institutions are, you know, coming up with ethical policies and uses of the technology. And yeah, just learning as much as I can about all of this. 28 00:06:35.740 --> 00:06:36.550 Suzanne McGinness (she/her): Thanks. 29 00:06:38.530 --> 00:06:55.241 Jason Tilk: Yeah, Jason, I think, similar to Susie in the sense of like trying to understand it. Because there's so much going on, and the buzzwords, and I had had some experience on it. Playing around with some of the generative tools that are out there. 30 00:06:55.810 --> 00:07:17.342 Jason Tilk: and so it was like morbid curiosity meets also the changing times of industry. There's a you know, in in the industrial design field. There's, you know, we get a lot of the get a lot of companies that come in and sponsor projects. And they're like, Use AI, you know, the the companies are hot on it so 31 00:07:17.690 --> 00:07:35.006 Jason Tilk: to understand number one how to use it and why they're asking to use it. And then, of course, you know, I'm gonna back, Susie, up on that ethical question like, well, should our students really be using it? that really come comes to light? 
So yeah, 32 00:07:35.900 --> 00:07:42.659 Jason Tilk: learn, absorb, and synthesize is the biggest reason that I joined the team. 33 00:07:45.490 --> 00:08:05.670 sligon: And I'll say the sheer, gigantic inevitability of AI, just the fact that it's happening and that our students are going to need to be prepared, and be expected to be prepared, to use AI. And if we can guide both ethics and 34 00:08:06.260 --> 00:08:28.799 sligon: things that facilitate growth as an artist, as opposed to things that take away from that, then I think ethics is intertwined with usefulness. I don't think you can separate them. But I wanted to obviously be Foundation's representative 35 00:08:28.990 --> 00:08:32.130 sligon: in this, and just learn as much as I can about it. 36 00:08:34.360 --> 00:08:59.949 Kari Weaver: Great, thanks, everyone. You know, that common thread of real concern for our students and our creative professions is so important, and that student voice is really crucial, right? We are working to get more of that student voice in this, and it was nice to have the student panel at the Institute recently. So I'm going to share a 37 00:09:00.210 --> 00:09:11.380 Kari Weaver: three-and-a-half-minute video clip of that student panel with AAC&U, where they're asking students about AI literacy. So 38 00:09:12.930 --> 00:09:17.390 Kari Weaver: I tried this before, and it worked, and hopefully it will work. 39 00:09:19.650 --> 00:09:25.190 Kari Weaver: Exactly. Tessa, since you're in ed tech, I'm gonna pitch this question to you: 40 00:09:25.850 --> 00:09:29.939 Kari Weaver: what is your understanding of what it means to be AI literate? 41 00:09:31.230 --> 00:09:46.530 Kari Weaver: Yeah, absolutely. So, stuff that I've learned in classes about AI literacy, because obviously this is such a hot topic, so all of my professors are talking about this consistently. I think, in terms of AI literacy, it's understanding when it's appropriate to use AI and understanding 42 00:09:46.750 --> 00:10:02.820 Kari Weaver: what information you are getting and how accurate said information is. So, for example, students who are not very AI literate, or don't have as great literacy, are the students who are using ChatGPT to write essays for them and turning that in. 43 00:10:03.010 --> 00:10:11.659 Kari Weaver: That is not what AI is. I mean, it's a capability of AI, but that's not what it's meant to be used for, or should be used for, in education. 44 00:10:11.770 --> 00:10:22.490 Kari Weaver: Instead, people who are more, I guess, AI literate are able to understand: okay, the computer is not always correct. What AI is generating for me might not always have 45 00:10:22.590 --> 00:10:25.339 Kari Weaver: the correct information, so you have to double-check that. 46 00:10:25.740 --> 00:10:33.409 Kari Weaver: As I mentioned earlier, it's supposed to be a tool that is used to supplement, like, supplement points or 47 00:10:33.570 --> 00:10:41.569 Kari Weaver: things that you do in your life in terms of education, you know, maybe helping you to get, like, discussion question ideas, or helping you, like, 48 00:10:41.750 --> 00:10:44.340 Kari Weaver: get started on, like, essay topics and stuff like that. 49 00:10:44.560 --> 00:10:47.220 Kari Weaver: So just understanding how AI 50 00:10:47.450 --> 00:10:55.269 Kari Weaver: can be used appropriately, but also understanding the flaws of AI, not seeing it as, like, a perfect substitution.
51 00:10:56.170 --> 00:11:03.779 Kari Weaver: Does that match everyone else's sort of view of AI literacy? Or do you have an extra piece to throw in there regarding what it means to be AI literate 52 00:11:03.980 --> 00:11:04.690 Kari Weaver: today? 53 00:11:06.096 --> 00:11:12.649 Kari Weaver: Yeah, I wanna pop in, thinking about, like, now, with the education, you know, 54 00:11:12.800 --> 00:11:20.970 Kari Weaver: like she said. I do agree with her that, you know, people who are using, like, ChatGPT are, like, AI illiterate, versus folks who are using it to, like, further, like, 55 00:11:21.110 --> 00:11:29.690 Kari Weaver: continue and, like, grow their verbiage with AI. Typically, I think that it's something that should be introduced as, like, a course in, like, higher education, 56 00:11:29.840 --> 00:11:42.940 Kari Weaver: in my opinion, with that. So, you know, everyone gets the basics, you know: like, you know what AI is, you know what's the ethics, what's the concerns, how to speak to AI. Obviously, like, you know, I took a course at Berkeley. I never thought in my, like, 57 00:11:43.160 --> 00:11:54.669 Kari Weaver: whole life of a career that I would take a course on how to speak to a robot, and after taking that course it gets me thinking differently. Versus, like, you know, for example, when, like, one of 58 00:11:55.080 --> 00:12:00.259 Kari Weaver: one of my classmates uses AI, they're looking for the robot to do the work for them, 59 00:12:00.480 --> 00:12:06.639 Kari Weaver: which comes back to the take-ownership part. And I feel like if you don't have that knowledge of how to speak to AI, 60 00:12:06.770 --> 00:12:08.580 Kari Weaver: you're just gonna end up, like, you know, 61 00:12:08.680 --> 00:12:26.749 Kari Weaver: falling in line, you know, with that. So I feel like you have to learn what it is first, you know, how to speak to it, because regardless, like, even if you type in ChatGPT, everybody can do it right now. I've, like, experienced it, I tried it out: if you tell ChatGPT to write your essay, it's not going to write your essay. It's going to tell you how to write the essay, 62 00:12:26.810 --> 00:12:40.350 Kari Weaver: like, you know, pinpoint everything with that. And I learned that on my own. So instead of using that, like, against me, I just learned, like, oh, okay. And then, you know, eventually just start building your skills of how to do better, and then eventually, like, you may not need it, or you may still need it. 63 00:12:40.510 --> 00:12:44.510 Kari Weaver: So I think it's something that should be required. And 64 00:12:44.650 --> 00:12:52.279 Kari Weaver: you know, I'm talking about an amazing professor that really broke it down and made it simple and concise through my semester. So now I am an expert at 65 00:12:52.410 --> 00:12:55.669 Kari Weaver: talking to any AI generator. Okay. 66 00:12:57.504 --> 00:13:00.060 Kari Weaver: Are you referring to Jason? Is he the professor? 67 00:13:00.598 --> 00:13:19.410 Kari Weaver: Yes, yes, Mr. Gula himself. He makes the class fun, and I learned so much from it, especially where I work at a company right now that, you know, they just launched their own AI. So, you know, on my mobile devices they summarize notifications for me and everything with that. So I'm 68 00:13:19.850 --> 00:13:23.919 Kari Weaver: like, everywhere is AI around me. And taking this course really put me 69 00:13:24.280 --> 00:13:49.440 Kari Weaver: above, not even just in education, but in my daily work life, too.
So if you guys are in the chat listening highly, recommend educating yourself on. You know how to get effective prompts from Chat Gpt. Versus telling the AI to do the work for you, because if it does it for you. You're not taking ownership, and I think that's where it comes. That dilemma of like, you know, plagiarism or cheating you gotta you gotta put in some work to receive some work. 70 00:13:55.030 --> 00:14:10.159 Kari Weaver: I thought that was really interesting to hear. Kind of those experiences of, you know, they both in education and in work, and certainly we don't have the same kind of. I was just talking to the Student Leadership Council, and I'm like. 71 00:14:10.160 --> 00:14:35.060 Kari Weaver: you know, what we heard a lot of in this institute is like concerns about plagiarism and writing right? Those aren't our major concerns. And you heard some of that here. But you heard authenticity, right? Ownership. That's what our students want, right? They want to feel like their own voice is important. And that's what they're cultivating. And I really appreciated hearing from that 72 00:14:35.060 --> 00:14:46.580 Kari Weaver: and thinking about somebody who's seen it both in education and the work world. And you know, seeing that it is everywhere, and feeling like they are able to work with it appropriately. 73 00:14:47.515 --> 00:15:02.120 Kari Weaver: I don't have other clips from the student panel, but another one that was impactful is the moderator asked the students, if they felt like there should be a required AI course in college, and it was a resounding yes. 74 00:15:02.320 --> 00:15:05.800 Kari Weaver: right. Even those who had. 75 00:15:05.850 --> 00:15:16.719 Kari Weaver: like a lot of concerns about it, said yes, and a comment a student made really hit home when they're contextualizing. Why, they said, yes. 76 00:15:16.730 --> 00:15:30.420 Kari Weaver: they said that, and I'm paraphrasing. But you know we're we're in a generation that has had to kind of learn on our own what social media is and how to navigate, that we didn't get a lot of leadership. 77 00:15:30.420 --> 00:15:56.699 Kari Weaver: a lot of understanding. And we see people using it in really problematic ways. We see a culture around it that's really problematic. And we need to know this information. We want guidance. We want leadership in this. And I thought that that was so impactful to hear. And I want to see if any other members of the team had anything they wanted to share from that student panel. 78 00:16:02.860 --> 00:16:26.070 Suzanne McGinness (she/her): I think the key aspect that I found to be really important was the AI literacy right? Just understanding the technology, understanding its uses, being aware of it as opposed to, especially in illustration, is sort of like, oh, we're going to avoid it. But it's what I've learned is that, and especially receiving impact or input from the student panel. Is that? 79 00:16:26.070 --> 00:16:38.660 Suzanne McGinness (she/her): No, we're craving kind of understanding its uses. So we can, like the 1st speaker said, use it appropriately, and I thought that was wonderful to hear, and absolutely applies to our students here at CIA. 80 00:16:42.590 --> 00:17:06.329 Kari Weaver: Thank you, Susie, and our team has always said we need the student voice from here, because again, the students you're hearing there. Most of them are not in art and design related areas. So we need to get the student voice from here as well. 
We've got a student town hall that's scheduled for February 14.th So look out for information on that. 81 00:17:07.020 --> 00:17:23.060 Kari Weaver: and we want our students to understand that we hear and share their concerns, and they can trust us to provide a holistic education that really cares for them in so many ways moving on to the next question. So 82 00:17:23.250 --> 00:17:37.570 Kari Weaver: what have you learned from engaging in this Institute, which has included a variety of talks and learning sessions, cross institutional discussion and mentorship. And this time I'd love to start with Scott. 83 00:17:38.940 --> 00:18:02.660 sligon: Well, I'm going to connect that back to my 1st answer to my 1st question, the sheer inevitability of it. I think one of the statistics that was mentioned in one of the conferences was 70% of jobs that today's freshmen will graduate into are going to require AI in some way, and then going back to the student panel that you showed a clip of. I think that 84 00:18:02.910 --> 00:18:21.859 sligon: all of them were hungry for guidance, and mentioned something to the effect of if you ignore it, it's tacit approval that leaves the students up to to figure out for themselves what tools are available, and what should they do with them? So I think that we need to 85 00:18:22.610 --> 00:18:48.940 sligon: to be shepherds for for that. And you know, I think that there might be a lot of negative, disruptive things happening with AI. But also it's possible it's going to cure cancer, you know. So something that's such a powerful tool should not be ignored. And as educators, as teachers, you know, readying these people for jobs and careers, you know, and and 86 00:18:49.080 --> 00:18:53.787 sligon: all of the creative ethical aspects that can use be used. 87 00:18:54.770 --> 00:18:58.259 sligon: I I just think it's it's a necessity. Yeah. 88 00:19:06.320 --> 00:19:19.190 Suzanne McGinness (she/her): For me. It was great to hear how other institutions are collaborating with the technology, how they're enforcing institutional policies, how they're even just beginning to introduce AI onto their campuses. 89 00:19:19.190 --> 00:19:44.009 Suzanne McGinness (she/her): Some institutions were like West Point, for example, was, you know, already, integrating everything into the curriculum where there was a lot of other institutions and colleges like us that were just beginning and kind of getting to understand this process of how do we start this trickle down effect. And how do we introduce it to the faculty and bring the excitement to the students and create a sense of play and wonder around the technology as well? So. 90 00:19:54.380 --> 00:20:01.854 Jason Tilk: I think, some of the some of the interesting stuff was the the use of it internally? 91 00:20:02.835 --> 00:20:08.679 Jason Tilk: some of the tools that were shared were very much stressing the sort of like. 92 00:20:08.900 --> 00:20:15.067 Jason Tilk: how do you offload stuff? How do you get another point of view using AI for that. 93 00:20:16.620 --> 00:20:44.359 Jason Tilk: Some of it was a little dry and like I was having trouble like applying it and then other times I was inspired by like. Oh, well, this is, you know, this is filling in gaps, for you know where I can have a partner in my development of things, be it even curricularly. Or, you know, teaching students how to integrate integrate tools as a collaborative partner. 
94 00:20:53.200 --> 00:20:57.060 Jimmy Kuehnle: I think my biggest takeaway from the student panel was how much 95 00:20:57.300 --> 00:21:05.570 Jimmy Kuehnle: it was ubiquitous and how much the students embraced the inevitability. We don't always 96 00:21:06.010 --> 00:21:34.750 Jimmy Kuehnle: see that with students. And it was, it was refreshing. Granted, these were more graduate students, so I think that they felt that way. But it would be interesting, as we have the Town Hall with our students to see how they're using it. Because I think, both in a writing context and in a visual context, the considerations and ethical considerations are different. The inevitability is the same. But I think it's a conversation that would be interesting to have with our students with that angle. 97 00:21:37.450 --> 00:21:43.469 Kari Weaver: Yeah, I think also, what was important is that the the bulk of the learning, or maybe the the 98 00:21:43.610 --> 00:22:05.279 Kari Weaver: most rich learning didn't come from the Institute Sessions themselves. It came from engagement with others here at CIA. From the from this team I learned so much from others, and having conversations and seeing what others were doing. So again, that importance of having community around this is really so crucial. 99 00:22:06.660 --> 00:22:29.430 Kari Weaver: I guess I'll share one other thing from that which is, you know, I had moments where I'm like, where are we at? Institutionally, in terms of you know how much Susie mentioned this. A little bit of how much we're integrating this, and in what ways. And I think the committee has really struggled with, what do we move fast on? And what do we move slow on right? What do we have? 100 00:22:29.530 --> 00:22:53.459 Kari Weaver: Are we worried about people graduating this year who may not have so much exposure and knowledge? Absolutely. But nobody wants to go fast with, you know, trying to get a course approved when we're not even sure what that should look like. So we've had lots of conversations about kind of grassroots, fast and formal, slow. 101 00:22:53.460 --> 00:23:06.290 Kari Weaver: and it was really nice to see that that was a common approach of other institutions, and made a lot of sense. So just even that comparison across institutions was really helpful. 102 00:23:06.420 --> 00:23:10.719 Kari Weaver: I'm for this next question. I think of 103 00:23:10.930 --> 00:23:34.149 Kari Weaver: a few of you are going to show some examples. So the question is, what is something you found you could do with AI that you could not do before. And what's important about that example. So if we can go in the the order of Scott, and then Jimmy, and then, Jason, love to see some examples and hear from you. 104 00:23:35.180 --> 00:23:58.159 sligon: Okay. So I just put a link in the chat to a Google drive folder that everybody has access to and can download the documents. There is a document in there that has to do with adobe, and like a little where to find and how to use some of the tools in adobe, and that's something that I'm 105 00:23:58.160 --> 00:24:13.499 sligon: sharing and talking to my foundation students about. And it also has these 2 other clips that I'll try to show you now. So I was making a movie share my screen. 106 00:24:15.320 --> 00:24:26.206 sligon: Let me see, I'm going to try using quicktime player specifically, and tell me if you see 107 00:24:27.490 --> 00:24:33.459 sligon: picture of me with a green screen, do people? 
Oh, here's share 108 00:24:34.460 --> 00:24:38.990 sligon: open system settings. Oh, good Lord, okay, hold on. One second 109 00:24:46.390 --> 00:24:49.719 sligon: does does do people see the green screen image. 110 00:24:55.120 --> 00:24:56.320 akohoot: No, not yet. 111 00:24:56.650 --> 00:24:57.410 sligon: Okay. 112 00:25:09.830 --> 00:25:11.680 sligon: Okay, let me try it a different way. 113 00:25:21.610 --> 00:25:30.140 sligon: I'll give up in a second if I can't get it. So, Cari, what did you use when you were sharing the screen? Just application windows, everything. 114 00:25:30.350 --> 00:25:36.410 Kari Weaver: Yeah, I went to share, and then I have a few monitors. So I shared my other full screen. 115 00:25:36.410 --> 00:25:42.869 sligon: Okay? So let's see. Now, can you see a picture of me with a green screen. 116 00:25:45.460 --> 00:25:46.180 Kari Weaver: Nope. 117 00:25:46.830 --> 00:25:47.659 sligon: It. 118 00:25:48.490 --> 00:25:48.990 Jason Tilk: On, mute. 119 00:25:48.990 --> 00:25:53.180 Kari Weaver: You wanna talk us through because you have links to the the clips, and if. 120 00:25:53.180 --> 00:25:55.059 sligon: Yeah, I actually, yeah, do you want me? 121 00:25:55.060 --> 00:25:56.009 Kari Weaver: To open one. 122 00:25:56.860 --> 00:25:58.420 Jason Tilk: Just curious can try to share it. 123 00:25:58.780 --> 00:26:00.110 Kari Weaver: Okay, which one. 124 00:26:00.330 --> 00:26:02.040 sligon: Okay, open the green 1 1.st 125 00:26:02.760 --> 00:26:05.719 Kari Weaver: There, I don't see. I see Max after. And Max before. 126 00:26:06.583 --> 00:26:07.649 sligon: Max! Before. 127 00:26:07.650 --> 00:26:08.390 Kari Weaver: Okay. 128 00:26:09.270 --> 00:26:10.970 sligon: Okay, let me come back to the 129 00:26:11.290 --> 00:26:14.762 sligon: classroom, and I'll unshare it just in case I get in your way. 130 00:26:17.070 --> 00:26:18.020 Kari Weaver: Okay, I. 131 00:26:18.020 --> 00:26:21.149 sligon: So can you see, I've not seen that either. Okay, wait. 132 00:26:21.150 --> 00:26:22.139 Kari Weaver: It's coming. 133 00:26:22.350 --> 00:26:23.000 sligon: Okay. 134 00:26:25.630 --> 00:26:51.029 sligon: see it. Okay, yes. Okay. So anyway, I was doing the movie that I was working on about my son, and there was a portion of it. Most of it is ad-libbed, but there was a portion of it where I'm actually reading to kind of speed it up and get some control over it, and it was several hours of recording, digested and compressed into just maybe 135 00:26:51.030 --> 00:27:18.409 sligon: 12 min of time. But I was looking off to my side at my notes too much, and it was really obvious, and it was a lot of work, and I thought it was a pretty good performance that would be difficult to duplicate, but my eyes weren't looking at the screen, so it was not usable. And that's kind of typical of some of my work that, you know, I'll get like 90% of things that I'm really happy with. But then there's this thing that makes me have to 136 00:27:18.410 --> 00:27:24.245 sligon: do work over again, so you can see the the I'm looking off to the side right here. 137 00:27:25.130 --> 00:27:34.730 sligon: And then if you'll play so I went to this place called captions.ai, and had them do eye contact a redo. 138 00:27:34.730 --> 00:27:36.449 Kari Weaver: Sometimes I love. 139 00:27:37.030 --> 00:27:53.450 sligon: So you can see it redrew my eyes, and I was able to use the footage instead of having to redo, you know hundreds or tens of, or a hundred hours of working, of talking again. 
So 140 00:27:53.560 --> 00:28:22.260 sligon: I felt completely comfortable with that in terms of ethical use, although it made my eyes a little like different and prettier, I thought, but I was so happy to be able to use the work that I put. You know all that work that I put into it, and all that sincerity, and it was able to to save it. So those eyes are AI generated also. It had 141 00:28:22.290 --> 00:28:47.189 sligon: some glitches, and it was experimental at the time, Captionsai, and they could only record 12 seconds of 4 K. Video at a time. So I had to put 12 seconds at a time in, and then re edit and re-sync everything up after that. But it worked, and occasionally my body would like swirl in an impossible way. 142 00:28:47.190 --> 00:29:05.620 sligon: Like as I gestured it would be like, and and I'd have to get rid of that and try it again. So it was a lot of physical work, but it was so much better being able to capture that performance and and keep it so you can see both of those clips in the adobe a folder in the folder that I 143 00:29:05.900 --> 00:29:07.699 sligon: put the link in there so. 144 00:29:08.400 --> 00:29:28.090 Kari Weaver: Great. Thank you. That's a that's a wonderful example. I enjoyed seeing that, and the story of you know how it, how much it saved. The authenticity of your initial read was so important, and that you felt like it was an ethical use of it as well, which which was really nice. Jimmy. 145 00:29:28.090 --> 00:29:28.720 sligon: Okay. 146 00:29:28.720 --> 00:29:30.739 Kari Weaver: Oh, wait! Scott! No! Go ahead! 147 00:29:30.740 --> 00:29:38.559 sligon: I think the thing that it saved that was human was so much greater than the thing that it replaced. That was human. 148 00:29:38.560 --> 00:29:40.350 sligon: Yeah, perfect. 149 00:29:41.110 --> 00:29:43.270 Kari Weaver: Jimmy, we'll move over to you. 150 00:29:44.128 --> 00:29:47.261 Jimmy Kuehnle: So so many things I can do now that 151 00:29:48.080 --> 00:30:15.720 Jimmy Kuehnle: there are large, statistically based knowledge generators. I've been able to code many more things while I was sitting here I've been running a model locally, and I now have a snake game that I can play on the screen. This is using the new deep seek model. If you've seen in the news that now you can do everything on your own computer without everything. So I did that right there. And I've been able to 152 00:30:18.530 --> 00:30:27.850 Jimmy Kuehnle: explore many topics that wouldn't be possible. Video generation, as you can see in the links I sent before, and also just 153 00:30:28.950 --> 00:30:51.000 Jimmy Kuehnle: use new teaching methods. For example, in my electronic arts class, I'm using AI generated code generators. And before that it was so hard to have the students be able to do the code, the building, the electronics, and the thinking. So at least we're taking a few of those away. And my, one of the things I'm most excited about is 154 00:30:51.070 --> 00:31:07.279 Jimmy Kuehnle: the multimodal models. So, for example, you can share your screen, or if I open up like a software program blender here that I have on the screen. It has that default blender cube, and if I don't know anything about Blender, I can go ahead and open up 155 00:31:07.890 --> 00:31:16.219 Jimmy Kuehnle: a software. I could run this on my computer. Here's my cell phone. How do I move the cube up? I need to move it. Can you help me out. 156 00:31:17.430 --> 00:31:29.490 Jimmy Kuehnle: sure to move the cube up. 
Just press G to grab it, and then Z to lock the movement along the Z axis. That will let you move it up or down. Okay, how do I make it, like, softer? I need it to look softer. 157 00:31:31.640 --> 00:31:38.579 Jimmy Kuehnle: To make the cube look softer, you can apply a Subdivision Surface modifier. Where is it on the screen? Tell me where, I don't know what to do. 158 00:31:41.320 --> 00:31:47.970 Jimmy Kuehnle: Sure. To add a Subdivision Surface modifier, look to the right side of your screen for a wrench icon in the Properties panel. 159 00:31:48.110 --> 00:31:53.819 Jimmy Kuehnle: Click that, then choose Add Modifier and select Subdivision Surface. You'll see it smooth out the cube. 160 00:31:59.100 --> 00:32:02.640 Jimmy Kuehnle: Hey, that's great. Now, how do I make it look like a face? 161 00:32:04.180 --> 00:32:32.910 Jimmy Kuehnle: To make it look like a face, you'll want to start by going into Edit Mode and selecting some faces. So I think there's a lot of potential for something. It told me where to look on the screen. I didn't tell it what software I was using. It can do all kinds of things. What I'm really excited about is this: so here's a multimeter. I haven't shown the students how to use this. This is actually hooked up wrong, and if I plugged it into something, it would blow up. And so I want to see if it can tell me to do it right. 162 00:32:38.360 --> 00:32:45.210 Jimmy Kuehnle: Hey, I have this multimeter, and I want to check to see if a circuit has a current of 10 amps. Do I have it hooked up right? 163 00:32:46.710 --> 00:32:58.189 Jimmy Kuehnle: I see you're holding a multimeter. To check if a circuit has a current of 10 amps, you'd set the multimeter to the appropriate current setting. Yeah, what's crucial to... yeah, which one is it? And do I have the leads in the right spot 164 00:33:02.130 --> 00:33:08.089 Jimmy Kuehnle: for measuring 10 amps? The red lead should be in the 10A socket and the black lead in the COM socket. 165 00:33:08.220 --> 00:33:10.900 Jimmy Kuehnle: Am I right or wrong? Do I have it right? 166 00:33:12.010 --> 00:33:17.189 Jimmy Kuehnle: It looks like the red lead is still in the voltage socket. Move it to the 10A. 167 00:33:17.470 --> 00:33:33.550 Jimmy Kuehnle: So it got it spot on. It got everything right. It knew where to put it, told me where to put it, and it could see that my multimeter was hooked up wrong. It told me that. And I've done many experiments where I've used it to 168 00:33:33.550 --> 00:33:57.689 Jimmy Kuehnle: troubleshoot things. It's very helpful, and I'm very excited about what that means. But don't mistake my enthusiasm for a lack of existential dread and concern and ethics. I figure the best way to talk about AI is to be well versed in it, and that's what I'm trying to do, because sometimes I'll hear somebody say something, like even that student that said you can't just tell 169 00:33:57.690 --> 00:34:11.190 Jimmy Kuehnle: ChatGPT to write an essay for me. I said, hmm, I bet you can. And so, right while they were doing that in the talk, I used DeepSeek to write an essay about how ChatGPT doesn't let college students write essays. 170 00:34:11.190 --> 00:34:29.419 Jimmy Kuehnle: And I just said, just make it for me, just do it anyway. And it did, and it wrote me a whole essay right there while the student was talking. So if you haven't done all the things and then you say something, just like in all avenues of knowledge, then it might not sound informed.
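For anyone retracing that Blender exchange later, the two steps the voice assistant described (G then Z to move the cube up, then a Subdivision Surface modifier from the wrench icon) can also be done through Blender's built-in Python API. A minimal sketch, assuming the default startup scene where the object is named "Cube"; run it from Blender's Scripting workspace:

```python
# A scripted version of the two steps from the demo: move the default cube
# up along the Z axis, then soften it with a Subdivision Surface modifier.
# The object name "Cube" assumes Blender's default startup scene.
import bpy

cube = bpy.data.objects["Cube"]

# "Press G, then Z" -- translate the cube upward along the Z axis.
cube.location.z += 2.0

# Wrench icon > Add Modifier > Subdivision Surface, done through the API.
subsurf = cube.modifiers.new(name="Subdivision", type='SUBSURF')
subsurf.levels = 2          # subdivisions shown in the viewport
subsurf.render_levels = 2   # subdivisions used at render time

# Shade smooth so the subdivided cube reads as "softer".
cube.select_set(True)
bpy.context.view_layer.objects.active = cube
bpy.ops.object.shade_smooth()
```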
So those are some of the things I've been able to do. 171 00:34:35.110 --> 00:34:50.721 Jason Tilk: Yeah. Yeah. Jimmy's use of the the AI tools has been like massively inspiring. I am. So from my point of view. Once again trying to 172 00:34:53.647 --> 00:34:57.430 Jason Tilk: Let's see if I can go into presentation mode and not 173 00:34:57.540 --> 00:35:03.039 Jason Tilk: write everything out. Are you guys seeing the the main screen? Or are you guys seeing my notes screen. 174 00:35:04.160 --> 00:35:05.340 Suzanne McGinness (she/her): Main screen. 175 00:35:05.340 --> 00:35:32.509 Jason Tilk: Okay, cool, cool. So I am. I. I've been approaching it as a collaborator. So how can AI be a collaboration tool? You know it means leveraging what AI can do and not replacing a human necessarily with it. And you can question what I'm about to show you because I even question it. Like I, I'm still trying to make up my mind on this stuff. 176 00:35:33.542 --> 00:35:35.773 Jason Tilk: So looking for a partner. 177 00:35:37.090 --> 00:35:54.505 Jason Tilk: so I'm I'm this is approaching it from the the design side of things. I'm gonna give it. I'm gonna talk about giving an example to sort of like use. AI as as a collaborator in my brainstorm process. 178 00:35:55.120 --> 00:36:11.140 Jason Tilk: so my question to AI and I was using Chat Gpt for this. Just the regular old login, one what verbs are associated with coffee and sustainability, because I want to design branding for a sustainable coffee shop. 179 00:36:11.560 --> 00:36:30.538 Jason Tilk: So it gave me some verbs, because this is the kind of stuff that you do. You get these words out? You get action words out. You get adjectives out, what are some adjectives? This is the equivalent of me going to a thesaurus, but it did it in seconds rather than me. Hunting for the words which I think is kind of cool. 180 00:36:31.140 --> 00:36:41.260 Jason Tilk: I'm gonna argue at that point that the energy trying to think of the words on your own makes you think of other things rather than just being fed stuff. So once again, I'm 181 00:36:41.260 --> 00:37:06.250 Jason Tilk: I'm still making my mind up on this. So I use that. And I then I basically just ask Chatgpt, what could a brand look like for an ethical sustainable coffee shop? And it outputs a name, a logo concept, a color palette to use, and a tagline I'm like, oh, that's pretty cool. Well, I kind of feel like I might be able to sort of sort the logo stuff out on my own or work with a graphic designer 182 00:37:06.250 --> 00:37:08.660 Jason Tilk: to do that. And you know. 183 00:37:09.120 --> 00:37:36.795 Jason Tilk: But maybe I want a different name. I'm gonna I'm gonna rely on the things that aren't visual art related. So I'm gonna I'm gonna focus on the words and the writing portion of it. As my teammate. Give me 5 options for a name previously verdant brew so the result. Here's 5 fresh sustainable focus names for your coffee shop, everbean roots and roast the Green Cup. Flourish Coffee Company. 184 00:37:37.210 --> 00:38:04.069 Jason Tilk: pretty cool I'd like 5 different names. Based on the Everbean Cafe. I like I like that. I like that. So we've got ever brew verdant evergreen bean and bow and perennial bean. Pretty pretty cool. I think it's interesting. That Verdant came back again, after, you know, after the 1st after me, asking for different things. 185 00:38:06.578 --> 00:38:21.549 Jason Tilk: and then I was like, Give me some taglines. 
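Jason is describing the regular ChatGPT web interface here, but the same brainstorming loop can be scripted against a chat-completions API. A minimal sketch using the OpenAI Python client; the model name and the prompt wording are illustrative assumptions, not a record of what Jason actually typed:

```python
# A scripted sketch of the brainstorming loop described above, using the
# OpenAI Python client (pip install openai). Model name and prompts are
# illustrative assumptions; Jason used the regular ChatGPT web interface.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def ask(prompt: str) -> str:
    """Send one brainstorming prompt and return the reply text."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


# The same sequence of asks from the walkthrough: words first, then names.
print(ask("What verbs are associated with coffee and sustainability?"))
print(ask("What adjectives are associated with coffee and sustainability?"))
print(ask("Give me 5 name options for an ethical, sustainable coffee shop."))
```

Note that each call above is independent, while the follow-ups in the walkthrough ("I'd like 5 different names based on the EverBean Cafe") lean on the earlier conversation; a scripted version would need to append each exchange to the messages list to keep that context.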
Give me some fuel on this that I can that I can think of and then it gave me a mess load of taglines here some of them I hate, and some of them I don't 186 00:38:21.650 --> 00:38:51.330 Jason Tilk: like. And then some of them are okay. But at the end I was like, Well, I'm going to pick a couple of words out of there that I did like and make my own up. I'm going to keep being in bow, and I'm going to say, naturally crafted with care. Because I want to push that sustainable side of it. But you know it was. It was a combination of the brood from nature, crafted with care, which made me think of naturally crafted with care. And I'm like, Okay, that's pretty cool. So I'm it's being a. It's being an editor, I'm being an editor, and I think that's kind of neat. 187 00:38:52.556 --> 00:39:01.503 Jason Tilk: Then just basically asking for design briefs straight from chat Gpt, could. 188 00:39:02.370 --> 00:39:10.435 Jason Tilk: you know, what would the design brief actually look like on this? And you know I'm not going to go into all the words on there, let you guys sort of scan that. 189 00:39:11.110 --> 00:39:40.529 Jason Tilk: This is all stuff that I could totally do, but it's been done instantly for me, which is kind of Nice, which is, gonna allow me to then generate my own. I can get right into the visual side of it. I can get right into the thinking side of it. If I'm a graphic designer, I can get right into the branding and what I want it to look like, and I can use this as a springboard. If I'm an interior designer, I can start to use this to start to understand what I want the interior to look like. I asked for 2 design briefs. 190 00:39:40.530 --> 00:39:52.939 Jason Tilk: and the second one, I thought, was pretty cool, because it had this idea of it threw out this idea of a living wall down with a statement feature on the bottom of it. I also found it interesting that 191 00:39:53.690 --> 00:40:09.670 Jason Tilk: they it wasn't necessarily the same categories in each one which I think was neat. It, you know it was good fuel for thought and then, you know the typical process. I'm gonna then, you know, I'd probably go out in the real world. I'd take photos 192 00:40:09.890 --> 00:40:19.490 Jason Tilk: document stuff hit up Google, do a Google image search using some of the words from it. And the funny thing is, is, I am 193 00:40:19.610 --> 00:40:42.129 Jason Tilk: 100 sure. The image in the top right corner is AI generated so, and it makes me start to question all the other ones as well. So even just a Google image search. So I'm like, well, what if I actually then start to use AI to generate my mood boards and and give me fuel. I don't want to. 194 00:40:42.240 --> 00:40:44.160 Jason Tilk: Oh, what happened here? 195 00:40:46.880 --> 00:41:02.239 Jason Tilk: Where's my cursor? Okay, I don't want to like really design the shop, but I for just for the hell of it. I threw the the 2 design briefs in right in the mid journey as prompts and it developed. 196 00:41:02.250 --> 00:41:30.999 Jason Tilk: you know, some good coffee shops. The funny thing is, is, I kind of feel like I saw some of the images on Google that this was referencing as somebody that's critical on the process. I'm trying to evaluate that. Do I like these? Oh, sure, whatever they might be good for an image board. My favorite part is the fact that AI couldn't sort out what a living wall was. So it just wrote living wall on a wall. I think that's absolutely hilarious. 197 00:41:31.514 --> 00:41:49.025 Jason Tilk: So. 
But in reality I can't do much with this in my head as a designer. So I said: abstract, clean lines, uncluttered spaces, upcycled color. I just tried to mix up a lot of the words that I was given through ChatGPT, and 198 00:41:49.740 --> 00:41:56.977 Jason Tilk: this is what Midjourney created. I threw in more stuff, more of the words, 199 00:41:58.120 --> 00:42:00.029 Jason Tilk: and then I was like, wow, this is 200 00:42:00.540 --> 00:42:22.349 Jason Tilk: weird. It's not really giving me any fuel. So I got rid of the words "vertical gardens" in here, and now, all of a sudden, I feel like I've tailored Midjourney to give me something that's a good mood board that I can then start to sit down and draw with and start to concept. And this is all within, 201 00:42:22.560 --> 00:42:41.810 Jason Tilk: I mean, the longest thing was actually making the presentation out of this, rather than the actual work associated with, like, generating the words and the like. So my big questions on this are: 202 00:42:41.810 --> 00:43:04.309 Jason Tilk: do I like it as a collaborator? I totally do. It makes things go faster. My criticisms on it are, I feel like when I sit there and the smoke starts to come out of my ears, and I'm, like, poring over, like, well, what's another word for this? There's something that happens in that moment that AI is replacing. And I'm questioning if, 203 00:43:04.440 --> 00:43:07.410 Jason Tilk: yeah, I'm just questioning if it works, 204 00:43:07.700 --> 00:43:11.343 Jason Tilk: is it right, is it a benefit? 205 00:43:12.620 --> 00:43:32.119 Jason Tilk: And professionally, in my consultancy work, I use it sometimes, and most of the time I don't, at this point in time. But still, that's the purpose of why we're all here: to learn and understand how it can be integrated, and then how we can share it with our students to make their lives better or faster, or 206 00:43:34.240 --> 00:43:38.039 Jason Tilk: being part of the design world that is trying to use it. 207 00:43:39.440 --> 00:43:41.130 Kari Weaver: Yeah, thanks so much. 208 00:43:41.540 --> 00:44:02.169 Kari Weaver: That kind of example is so useful to see. You can start to see what skills and knowledge you needed for that interaction and what you didn't need. And, you know, I think also, when we start to see some of this output, we're like, boring, right? It just looks the same. And I think that's 209 00:44:02.680 --> 00:44:19.729 Kari Weaver: good that we're seeing that, in many ways, of like, no, there's so much space for creativity, right? There is a blandness to this without that kind of creative input. And so it's nice to see that. Yeah, Jason? 210 00:44:19.730 --> 00:44:38.689 Jason Tilk: Yeah, you know, backing you up on that completely: like when I said, is this a good coffee shop? Yeah, sure, it's good. But you want to know what, it's not indicative of something that I expect to see out of our students, let alone a professional, that's, you know, an alum from the Institute designing interior spaces. 211 00:44:38.690 --> 00:44:50.370 Jason Tilk: So yeah, it's that, you know, the critical thought. And I think all of our students are already saying that. So I think it's putting us in a good position. Just backing you up on that, Kari. 212 00:44:50.370 --> 00:44:58.680 Kari Weaver: Yeah, thank you. We have one last question, and then we can get to an open discussion.
And 213 00:44:58.680 --> 00:45:23.919 Kari Weaver: I'm excited to end on this question, what big questions or thoughts have you been wrestling with as a result of this critical engagement with generative AI or with AI in general? And I have an order for this one. We're going to start off again with Jason, then Scott, Susie myself, and we'll close with Jimmy. So right back to you, Jason. 214 00:45:23.920 --> 00:45:37.015 Jason Tilk: Yeah, sure. And honestly, like, I'm thinking a lot along. You know, my big takeaway is kind of where I left off a little bit on that I'm seeing I like it. When 215 00:45:38.200 --> 00:45:55.470 Jason Tilk: I can use something that helps me in something that's not my core competency. I I joke. Oftentimes I've been in meetings with Ceos talking about products, and I'm like, well, I went to art school, but I don't think that this, you know, like. There's all these business people in the room, but it's my adjacency that. 216 00:45:55.470 --> 00:45:56.050 Rachel Ferber (she/her): We're getting. 217 00:45:56.050 --> 00:46:23.329 Jason Tilk: In that case brings a level of focus to the conversation. So in this case, yeah, I went to art school. I you know I wasn't. I'm not a writer. That's not my, that's not my core skill. We have amazing writers at our art school now, and we probably did when you know, in 97, when I graduated. But it just wasn't me so like when I see what what AI can help me out with. I love it. Being an editor I love. 218 00:46:23.390 --> 00:46:37.830 Jason Tilk: you know I love it, being adding tone, you know I can add tone to something and say, be more illustrative on this, so I can sound happier professionally. I I use it to help me write 219 00:46:39.631 --> 00:46:41.938 Jason Tilk: press releases for my theater. 220 00:46:42.560 --> 00:46:46.500 Jason Tilk: And but at this 221 00:46:46.680 --> 00:47:03.377 Jason Tilk: so so I use it for that. But then what I find is is now, because I've done it so many times, or or done it. Now I write even better in my regular life. So I'm not so it's ended up teaching me which is kind of funny, I'm like, oh, well, this is how we 222 00:47:04.490 --> 00:47:21.469 Jason Tilk: Oh, this is how I might want to really word this so and all of a sudden the input gets better. And and once again it is garbage in garbage out like. If you don't put the right stuff in, you're not going to get what you want out of it. You know, Jimmy, I'd be interested to read the essay that you're that you're 223 00:47:21.720 --> 00:47:28.331 Jason Tilk: deep fake wrote or not. Deepfa. The other one wrote about Chat Gpt writing an essay? 224 00:47:29.236 --> 00:47:41.973 Jason Tilk: because it doesn't sound like there was that much input into it. So you know it. Honestly, it's a battle. It's like, well, am I making myself worse? Or am I making myself better? And you know definitely 225 00:47:43.260 --> 00:48:01.337 Jason Tilk: I am. I just use it for fuel at this point in time? Because I I wanna I'm too arrogant to say I want something else to make my thing for me. I wanna make it so. I don't know 2 cents. Hopefully, I didn't diverge too much from what we previously chatted about. But 226 00:48:02.220 --> 00:48:29.480 Jason Tilk: And I think the the other big thing is is paying attention to what the industry is doing professionally as as an educator. 
And you know, as a designer educator working with companies that come to the Institute, and they're like, you know, telling the students to use AI for stuff because they're trying to use it, because I think companies are still trying to wrap their head around what it is. So once again, that makes an opportunity for us as a leading art school and institution. 227 00:48:29.980 --> 00:48:45.169 Jason Tilk: to guide what the use is for this. So I'm super excited to see where we all end up. Once again. I'm soaking it in and synthesizing. And yeah, just excited to work with our team because we have had amazing conversations amongst us. 228 00:48:45.647 --> 00:48:48.079 Jason Tilk: and I'll pass the mic to Scott. 229 00:48:48.080 --> 00:48:48.953 sligon: Thank you. 230 00:48:49.850 --> 00:48:58.158 sligon: let's see, I'm old enough to be have been excited about the Internet, the sum total of human knowledge at your fingertips. And 231 00:48:58.700 --> 00:49:26.230 sligon: you know, instead, I think the Internet has been used to spread misinformation and confirm the biases that people already had. So I guess that's the why I'm saying that is, we're we're not trying to sell you something. We're not trying to come in and say, Yeah, we're going to do these things that we've decided now. But we want to have a conversation about something that's such a such a powerful tool. And I think 232 00:49:26.800 --> 00:49:39.460 sligon: at its best, it's it evens the playing field like a person now and increasingly in the future, can sit at home and produce a movie by themselves that is more or less 233 00:49:39.830 --> 00:49:50.680 sligon: increasingly equivalent of something that was funded by a lot of money in a big corporation, and I don't see that as a bad thing, being able to have 234 00:49:51.640 --> 00:50:21.409 sligon: customization in terms of being able to have a 1 assignment be presented different ways for different learning abilities or different backgrounds, so that it's more understandable or relatable to people powerful tool. So I think this is a conversation with you about what we should do about these powerful new tools. It's not something that's handed down from us and in terms of 235 00:50:22.500 --> 00:50:40.320 sligon: recommendations. And we recognize the negative stuff and the scary stuff as well as the positive stuff. And the truth is, no one knows how this is going to end up, you know, and but we can help shape it along the way which can help determine how it ends up. 236 00:50:42.170 --> 00:50:43.719 sligon: So that's it. 237 00:50:45.440 --> 00:51:13.369 Suzanne McGinness (she/her): Thanks, Scott, I'll jump in. I've been just humbly observing this year, as I've had little to no experience with AI, and I've learned so much from the conference, but as well as my peers, especially Jimmy's contagious enthusiasm for play, I think, has been really inspiring me to understand how this can be so helpful in boosting creativity. I as someone who 238 00:51:13.370 --> 00:51:38.360 Suzanne McGinness (she/her): kind of has been limited by oh, well, I've never done 3D modeling. I don't have time to take a course in it. It's something I've always wanted to try. But you know, in the course of 5 min Jimmy made a 3D. Model of an illustration I had on Instagram, and it really opened my eyes to understanding that I'm no longer limited by the tools to create. 
And I think the keyword being tool to understand how to use 239 00:51:38.360 --> 00:51:54.600 Suzanne McGinness (she/her): some programs to focus on my creativity instead of getting bogged down by like the processes of using and memorizing quick tools, and all of these other things that are so useful, and tools like Blender, or whatever it may be after effects. So 240 00:51:54.680 --> 00:51:58.670 Suzanne McGinness (she/her): yes, it. It has been enlightening and exciting. 241 00:52:01.660 --> 00:52:05.469 Kari Weaver: Thanks. Yeah, I I echo that very much so. And it. 242 00:52:06.190 --> 00:52:21.879 Kari Weaver: I think a lot about where we are, how we understand developmentally, at what stage, what knowledge is needed. Right? Because I think for all the people on this committee, we can see, especially in our professional areas like that. 243 00:52:21.960 --> 00:52:37.129 Kari Weaver: Our specialized knowledge is so important to how we use it, and some aspects are are absolutely crucial. Even though it's changing the way in some ways that we can work or could work. You know that that 244 00:52:37.490 --> 00:52:58.809 Kari Weaver: we have to have a certain level of knowledge in many areas and a real ethical grounding, right in what we're doing and what we want to do. And I so appreciate the points about like we have an opportunity to shape industries and how they use these tools right, and be leaders in that. That's so exciting for me 245 00:52:58.830 --> 00:53:09.240 Kari Weaver: the the question and thought, I've been kind of pondering a lot, and I've shared this with a couple of you in talking about AI. 246 00:53:09.310 --> 00:53:10.170 Kari Weaver: But 247 00:53:10.220 --> 00:53:37.609 Kari Weaver: I feel like the more we see what AI can do, the better understanding we have of what it can't do, which is so much more human, like. What it cannot do is what we do as humans. And so I think maybe we can spend more time there. I'd like to spend more time doing the more human things the things that mean a lot to me. And maybe I don't get, because. 248 00:53:37.610 --> 00:53:44.260 Kari Weaver: you know I there are tasks. I do that that could use some efficiency right? There are things in my life that 249 00:53:44.300 --> 00:54:04.509 Kari Weaver: you know. I don't enjoy spending much time on in my work, and maybe those could be more efficient, but still high quality where I could spend more of the the time and energy I have in the more human kind of critical aspects, and that that really excites me. So yeah, we'll wrap up and finish with Jimmy. Now. 250 00:54:06.540 --> 00:54:07.255 Jimmy Kuehnle: So 251 00:54:08.100 --> 00:54:33.880 Jimmy Kuehnle: I'm so excited just about how fast everything is moving. It's just amazing. We keep saying what humans can still do, that that domain is getting smaller and smaller. So Jason had that coffee shop. So here is Sora Video. I made warm coffee shops filled with fuzzy creatures. It's filled with all kinds of bunnies. And it's not just an image. It's a whole video. And then there's another video and another video. Here's a whole new coffee shop 252 00:54:33.880 --> 00:54:50.579 Jimmy Kuehnle: 10 seconds of full video. You see, all those buddies are so cute. That's the way I would take the video, too. And then, while I was doing that I had to make another video. And so here are a field filled with gorillas and art students. 
So it's just 253 00:54:51.330 --> 00:55:09.690 Jimmy Kuehnle: gorillas talking to art students and art students are drawing, and you can see them there. These are all fake people. None of this is real, all just generated. Right? There. Wow! I talk. Look at those 2 gorillas. They're so nice, and look at them. They're all they're all sitting there. They're drawing. The gorillas are drawing the gorillas like eating the 254 00:55:10.170 --> 00:55:17.609 Jimmy Kuehnle: the pad so you could go on and on like this. It just it just keeps. It just keeps going. And all of this 255 00:55:17.620 --> 00:55:37.939 Jimmy Kuehnle: you could just stop anywhere. And you could just see piles and piles of video and different things cats. This is an old Japanese storybook about these farmers and a turnip, and they're trying to get it out. And I wanted to see what that looked like in real life. And that's all really exciting. But the the 256 00:55:37.940 --> 00:56:03.110 Jimmy Kuehnle: the anecdote that I give that gives me both dread and hope. During the Iml training I was goofing around with these image video generators, and it was super fun. And there was another faculty there, Nick Lasons and I was showing him a claymation video that was created. And he was just shocked like, oh, my God, that is so, it's so good! And 257 00:56:03.140 --> 00:56:07.440 Jimmy Kuehnle: then the without missing a beat, he said, oh, I can make a movie 258 00:56:07.860 --> 00:56:25.860 Jimmy Kuehnle: meaning that. Yes, all of my training as an animator may soon be less good, but that means I have more things. But then, right back to existential dread. So I think there's going to be a lot of problems of how we deal with 259 00:56:26.040 --> 00:56:36.780 Jimmy Kuehnle: what happens, because it's only gonna get better. I and I made, I don't know maybe 10 h worth of synthetic video over the winter break because it was just 260 00:56:36.840 --> 00:56:59.780 Jimmy Kuehnle: piles and piles. You can just do it. And it's so much that there's not even. It's kind of like standing in front of Niagara Falls. There's just no stopping it. So you you might as well get a really strong barrel and try to go over the edge, because, you know here these bunnies aren't going to. Oh, I made these, bunny. These bunnies are so cute they are all holding on. AI is anti-human and anti creative signs. 261 00:56:59.780 --> 00:57:15.320 Jimmy Kuehnle: So it's it's really great. You know, they're protesting. These bunnies are protesting. AI. So it's it's really fun. So I'm going to leave it there. I could go on and on with all the different things and the problems, and it's both excitement and terror as I think anyone should be. 262 00:57:17.590 --> 00:57:22.369 Kari Weaver: Thanks so much for that. And I just, you know, on a critical eye I just noticed they're like 263 00:57:22.840 --> 00:57:25.610 Kari Weaver: the landscapes look very much 264 00:57:25.730 --> 00:57:42.969 Kari Weaver: kind of one note, right? They, the people mostly white, right? Like it. It just it. It has that aspect to it, too, and we have a responsibility to not have those kinds of representations. 265 00:57:42.970 --> 00:57:45.557 Jimmy Kuehnle: Absolutely absolutely. I 266 00:57:46.870 --> 00:58:04.039 Jimmy Kuehnle: I'm very cognizant of that. And I made a series of videos. If you look at the 2 videos on the the link that I shared it was intentional to see. Does the AI have it in it to represent the entire world? 
And those are tours of the world 267 00:58:04.040 --> 00:58:23.880 Jimmy Kuehnle: with farcical prompts. And yes, it can. It is. I bet you there are some IP things. I bet you, if you typed in "make a field of people" in a different location, it would make some assumptions for you. But who gets to choose what those assumptions are? Because all the data is in there, and it really can make 268 00:58:24.000 --> 00:58:42.270 Jimmy Kuehnle: anything you want, as long as you describe it. But when you say, like the classic example when the image generators came out, and you say, give me a picture of a doctor or a CEO, it was not the best situation. And so I think we should always be hypervigilant about that, absolutely. 269 00:58:42.270 --> 00:59:11.729 Kari Weaver: Yeah. Yeah. And that's helpful for me in thinking about where we might need to change our curriculum, right? What we might need to add more of in terms of supporting our students' critical thinking, in terms of understanding history, cultures, differences there. It's so crucial in this. So thank you, all the committee members, for answering those things and sharing those. I want to open it up, and I haven't really looked at chat. 270 00:59:11.730 --> 00:59:25.360 Kari Weaver: But I want to open it up to discussion. Any questions people want to ask the committee members, comments, things you put in chat that you want to lift up. So yeah, welcome discussion now. 271 00:59:58.600 --> 01:00:05.699 Kari Weaver: So we have a couple of points in chat, and I guess I should say, before, in case people jump off early, 272 01:00:06.100 --> 01:00:08.110 Kari Weaver: I'm just gonna remind you. 273 01:00:09.486 --> 01:00:34.313 Kari Weaver: Hopefully, that link works. Make sure you report your attendance, because we have it, I don't know if you've seen it, on our my CIA page. Let me know if the links don't work there. But we're asking people to really try things out, do some learning, do some experimentation, and report that, and just looking for, you know, 274 01:00:34.970 --> 01:00:56.780 Kari Weaver: a lot of attempts to think about it, to look at it, to experiment with it, and to investigate your reactions to that, because that's really important. It will also give us a lot of data on what you're interested in, what you're looking at, what you're thinking about and feeling. So please, please take part in that. 275 01:00:56.870 --> 01:01:09.679 Kari Weaver: I wanna call out a couple of the points in chat. The 1st one I'm looking at is Michelle's point. So, 276 01:01:10.460 --> 01:01:18.649 Kari Weaver: potential challenges we could encounter developing content and messaging around a decentralized set of tools. 277 01:01:19.037 --> 01:01:23.580 Kari Weaver: And this is a long comment. Michelle, can you talk us through this? Would you be willing? 278 01:01:28.350 --> 01:01:38.774 Michelle Eisen: Sure. I realize I kinda, like, spewed out a bunch of words. These are some things that I've been meditating on for, like, the last couple of years, as this conversation has 279 01:01:39.997 --> 01:01:41.789 Michelle Eisen: kind of shaped 280 01:01:42.190 --> 01:01:48.738 Michelle Eisen: out, and seeing it a couple of years ago, when I was in school, when the conversation was very different. 281 01:01:49.540 --> 01:01:50.900 Michelle Eisen: My...
282 01:01:51.180 --> 01:01:57.000 Michelle Eisen: I don't know if it's an interest or a challenge or something, but what I'd like to see, 283 01:01:57.580 --> 01:02:08.000 Michelle Eisen: how it shapes out as this starts to become institutionalized, is how we message around a set of tools that is not really unified by a specific 284 01:02:09.730 --> 01:02:18.680 Michelle Eisen: order set. So, like, when we think about teaching digital tools, we're looking at industry data. We're seeing, okay, the Adobe suite is still, 285 01:02:19.070 --> 01:02:24.991 Michelle Eisen: for the time being, the major tool set that students will most likely need to learn, 286 01:02:25.510 --> 01:02:27.819 Michelle Eisen: and we're able to 287 01:02:27.930 --> 01:02:35.119 Michelle Eisen: kind of develop really consistent messaging around that. This only came to mind really recently with DeepSeek. 288 01:02:35.880 --> 01:02:45.679 Michelle Eisen: It made me very aware that none of these tools are really interlinked with anything else. These models are either built upon, like, a central model, or they're these 289 01:02:45.790 --> 01:02:55.480 Michelle Eisen: offshoots. I wonder how that conversation builds out as we look at potentially educating students, developing curriculum and content around this. 290 01:02:55.680 --> 01:03:06.229 Michelle Eisen: How do you tackle that issue of this being kind of a web of loosely connected, tenuously so, tools that are maybe not 291 01:03:06.360 --> 01:03:16.619 Michelle Eisen: intercompatible or linked? I don't know. I'm just really interested in seeing what that could look like, where we're at currently. 292 01:03:20.690 --> 01:03:30.364 Jason Tilk: I think, as you're talking about the tools and them being sort of disparate, this is a weird adjacency, and 293 01:03:31.050 --> 01:03:56.640 Jason Tilk: you know, it's self-referential, and no disrespect to Carl Floyd, who was once the head of the sculpture department here, but the sculpture department had all these tools in it, and kind of nobody used them. There was a forge there, there was something else, there was welding. And basically it was kind of like this weird free-for-all, and the students just would sort out how to use it. And that level of, like, 294 01:03:57.210 --> 01:04:04.905 Jason Tilk: sandbox playground is kind of, like, where I feel we are right now, and your points are so valid, that 295 01:04:05.840 --> 01:04:21.880 Jason Tilk: as it moves forward and there's some level of cohesion towards it, I think we'll get to that point that we start to understand it. And I think what's great is you've come immediately with, like, this 296 01:04:22.260 --> 01:04:24.679 Jason Tilk: critical point of view on it. 297 01:04:24.890 --> 01:04:38.285 Jason Tilk: And that's what we all need right now. And I think, I don't know, we gotta make sure our students share that level of it. And I'm sure they already do; I hear them talk. 298 01:04:39.304 --> 01:04:52.649 Jason Tilk: But yeah, maintaining that, like, well, how does this really get either commonized, or what's the appropriate use? Or, you know, yeah, it's interesting. You just made my brain go sparkles. 299 01:04:53.380 --> 01:05:15.730 Jimmy Kuehnle: I think it's interesting that Adobe was mentioned, because it's all baked into Adobe already. Adobe is all in on AI. Microsoft is all in on AI.
They just changed their entire development team about 2 and a half weeks ago to be under an AI umbrella, all of them: the Windows team, the Azure team, 300 01:05:15.740 --> 01:05:39.799 Jimmy Kuehnle: everything. That is trillions of dollars. Meta is doing the same thing. Adobe is like, if you don't use this, you're going to be behind. They're the creative company. That's the CEO of Adobe. And you can use it in Photoshop right now, you can extend videos in Premiere right now, and Adobe is just about to release its video generator. And so is Google. DeepSeek is really fascinating because you can run it 301 01:05:40.010 --> 01:05:49.739 Jimmy Kuehnle: off the Internet. So I'm running DeepSeek right now on the Mac Mini behind me, locally. I can run it in a browser without any connection to the Internet. It's 302 01:05:50.280 --> 01:05:56.439 Jimmy Kuehnle: almost as good as GPT o1, and you can just have it, 303 01:05:56.570 --> 01:06:19.500 Jimmy Kuehnle: like, in the woods with a solar panel. That is a fascinating development, and it's going to be interesting to see what happens in the next couple of years and how that goes. But I think it's going to be more and more intertwined into just everything. And who gets to make the decisions on how that is? It's going to be Google, Microsoft, and maybe DeepSeek now, if Nvidia doesn't do it. 304 01:06:22.330 --> 01:06:26.229 Kari Weaver: I think all these questions also come with, you know, 305 01:06:26.670 --> 01:06:30.069 Kari Weaver: what are the ramifications of our choices of tools? 306 01:06:30.738 --> 01:06:47.019 Kari Weaver: You know, it makes me think even of shifts to open educational resources and, you know, reducing the costs for our students to have access to the tools and knowledge that they need. 307 01:06:47.020 --> 01:07:07.950 Kari Weaver: I think about students choosing majors based on cost sometimes, right, the costs that are associated with those different areas. And we can certainly see so much stratification in access to tools and additional costs that go along with those. So I think we need to be really careful about 308 01:07:07.980 --> 01:07:32.199 Kari Weaver: what we do choose, what we invest in, you know, how we talk to students about these tools and have them think about the costs associated with that as well, because, you know, those choices do promote certain things, right? They align with certain areas. So we're in a really interesting area of experimentation. And, 309 01:07:32.330 --> 01:07:46.709 Kari Weaver: you know, it's been nice for me, at least through the Nord Center, to fund some of that tool usage in the classes so that it doesn't come at an extra cost. But, you know, that's small scale. We can't continue to do that. 310 01:07:46.730 --> 01:08:10.210 Kari Weaver: How do we make decisions about who has access to what kinds of tools? When do people rely on open-access, open-source tools? And when do we need the, you know, tools that have a cost and give our students an edge in many ways? So I think it's a really complicated question that you bring up, Michelle. 311 01:08:19.189 --> 01:08:20.090 Kari Weaver: George. 312 01:08:20.410 --> 01:08:36.949 George Ramirez: Hi! Yes, 1st of all, hello, everyone. I'm George Ramirez. I'm in Liberal Arts, in case I haven't had a chance to meet you. It's my 1st year at CIA. I'm sorry if we discussed this already. I had to jump in late because I was doing search interviews.
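A minimal sketch of what running DeepSeek locally, offline, can look like in practice, assuming the model is served by a local runner such as Ollama on its default port; the runner, model tag, and prompt here are illustrative assumptions rather than details from the session.

```python
# Minimal sketch: query a DeepSeek model served locally by Ollama.
# Assumptions: Ollama is running on its default port (11434) and a model
# tagged "deepseek-r1" has already been pulled; both are illustrative choices.
import json
import urllib.request

def ask_local_model(prompt: str, model: str = "deepseek-r1") -> str:
    """Send one prompt to the local Ollama server and return its reply."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one JSON object instead of a token stream
    }).encode("utf-8")
    request = urllib.request.Request(
        "http://localhost:11434/api/generate",  # Ollama's local HTTP endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["response"]

if __name__ == "__main__":
    # Once the model weights are downloaded, this works with no internet connection.
    print(ask_local_model("Suggest three prompts for a claymation-style video."))
```

Once the weights are on the machine, nothing in this exchange leaves it, which is the point about having it "in the woods with a solar panel."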
But 313 01:08:37.090 --> 01:08:43.795 George Ramirez: I have a question and a comment. I guess the 1st is, and this is posed to everybody: 314 01:08:44.399 --> 01:08:51.689 George Ramirez: I'm interested to hear how you would all like to see 315 01:08:52.520 --> 01:08:57.919 George Ramirez: this, like, critical approach to AI being integrated into this very enthusiastic 316 01:08:58.529 --> 01:09:03.209 George Ramirez: move that we're having across the departments and the IML, 317 01:09:03.666 --> 01:09:16.999 George Ramirez: just 'cause, right, I think we're saying, like, okay, this is a great tool, but at the same time we need to teach our students how to navigate it appropriately. So I just want to hear, yeah, about, like, 318 01:09:17.200 --> 01:09:19.890 George Ramirez: how you would all like to see that 319 01:09:20.229 --> 01:09:36.970 George Ramirez: critical, ethical, societal component of AI being integrated into our curriculum. And then the comment is more toward, like, everyone. I'm personally very interested in, like, creating a course, or, 320 01:09:37.359 --> 01:10:07.160 George Ramirez: you know, using my media studies course that I'm proposing, to have some sort of critical AI studies, or, like, AI and ethics. So if anyone is interested in collaborating or talking to me about that, I would really, really love that, because my background is in media studies. So this is totally up my alley, and I would love to hear anyone's thoughts about that, whether that's here or one on one. So thanks. 321 01:10:12.060 --> 01:10:37.090 Kari Weaver: So I'll jump right in, because this is, you know, what I think about all the time, but I would be interested in hearing from others. I think that we need to start with ethics. I think everybody wants to have a level of trust there, that that's the position we're coming from, and it aligns with our institutional values. Right? 322 01:10:37.920 --> 01:10:55.260 Kari Weaver: People will struggle to innovate and feel freedom for creativity without knowing that they're aligning with something that's ethical. Many people, right? And I think about how we develop our curriculum in this. And I think we need room, 323 01:10:55.260 --> 01:11:18.220 Kari Weaver: right, in many ways, for choice. Right? I think students need some choice and flexibility. If you're going to offer an assignment and you're saying we're trying AI for this, can they opt out? Can there be some choice in that? Can they use a different tool? And if not, why not, right? Why is that choice not there? 324 01:11:18.390 --> 01:11:25.670 Kari Weaver: We can think about it in terms of giving people access, like 325 01:11:25.830 --> 01:11:31.770 Kari Weaver: the panelists are talking about. It feels weird to call you panelists. My peers are talking about, like, 326 01:11:31.830 --> 01:11:59.910 Kari Weaver: how, you know, oh, I couldn't really work in this kind of area before, but now I can. And so if you think about movement across majors and attempts in other areas, even that access to being able to maybe be in print but wanting to work a little bit more with animation. Is that possible? And can we do that in ways where maybe people are utilizing AI for the skills they're not 327 01:12:00.210 --> 01:12:22.589 Kari Weaver: as interested in developing? And we have students who need a different skill set, right? So, differentiated learning outcomes in there. Can we have courses that have a more flexible kind of titling and description? Right?
I was talking to this group about, we just need a class for, like, let's just mess around with stuff, 328 01:12:22.948 --> 01:12:33.340 Kari Weaver: and let's just innovate, and let's do that in a guided way. I didn't use those terms, but you remember what I was talking about there, probably. And 329 01:12:33.800 --> 01:12:43.649 Kari Weaver: in that, how do we structure assessment and grading so that it actually allows for that flexibility? How do we account 330 01:12:44.030 --> 01:12:46.240 Kari Weaver: for people, for 331 01:12:46.270 --> 01:13:11.050 Kari Weaver: students who absolutely, ethically, will not want to do this? And I think we will probably see more increases in areas with, like, high-touch craft skills, right? Really hands-on skills and tools. There will be a desire for that. And so where are we not just making room for increasing access and understanding of it, 332 01:13:11.050 --> 01:13:20.720 Kari Weaver: but also pushing back against it? Right? Of student desire to push back against it. And 333 01:13:20.720 --> 01:13:49.029 Kari Weaver: I think also about, as we're working towards curriculum that has this more formally, how can we better include students so that they're learning about it as we're creating it, and they don't have to wait until we have a formal class to have some access to it? Right? So how can we include them in curriculum design? We have a lot of students-as-partners kinds of areas for curriculum design. How can we do more of that here? So, 334 01:13:49.080 --> 01:14:00.240 Kari Weaver: George, I think about all of these areas of curriculum, but I feel very strongly that we need to start out with ethics, and I'd love to hear what other people have to say, too. 335 01:14:00.640 --> 01:14:18.890 Jimmy Kuehnle: I want to jump in really quick about the opt-out. I think, of course, you can never force anyone to do anything; that's absolutely paramount. But I think, as we allow students to opt out, we owe it to them to teach them how much they will be 336 01:14:19.180 --> 01:14:48.869 Jimmy Kuehnle: potentially disadvantaged and behind. I just installed Photoshop on this computer because it wasn't there. I took a screenshot of the splash screen of Photoshop; it has this gentleman here. With Photoshop only, I said, add a hat. There's a hat. Okay, now add sunglasses. There's sunglasses. Add a bow tie. If you want to sit around and try to do that in Photoshop by hand, your peers are going to be beyond you, so don't get good at that anymore. Don't get good at selecting things in Photoshop. Don't get good at making images in Photoshop. 337 01:14:48.940 --> 01:15:17.810 Jimmy Kuehnle: That's over. That is not a thing; you can do it with these AI tools now. And so you still need to make great images, but you've got to do it in a different way. And I think that means getting good at the things that AI can't do. So far, AI cannot knit a hat. AI cannot make an immersive environment. AI cannot make the tactile things. To me, there is great opportunity to say, I can make even better tactile, wonderful things through using these other tools, because 338 01:15:17.810 --> 01:15:25.030 Jimmy Kuehnle: the essay train has left, the image train has left, the video train is about to leave. 339 01:15:25.470 --> 01:15:39.259 Jimmy Kuehnle: Well, it's boarding. Let me put it that way. It's boarding.
I agree with your shrug there, Kari, but I think that we owe the students both: allowing them not to use it, but we also owe them an education about what the tools are, I think. 340 01:15:40.630 --> 01:15:50.499 Jason Tilk: Yeah, Jimmy, to back that up: in our solo conversations as the committee I brought up, you know, back in the day, the 341 01:15:50.570 --> 01:16:16.329 Jason Tilk: industrial design community was so concerned about 3D computer modeling, and how that will never be a thing, and we don't want this to happen, and CAD jockeys. And now that's the way a designer controls the surfaces of something, and Jimmy shows off that you can make 3D models instantly. The interesting thing, then, is that the 342 01:16:16.330 --> 01:16:25.199 Jason Tilk: craftsperson-meets-designer in me then says, with these tools to sculpt, the tech 343 01:16:25.250 --> 01:16:33.809 Jason Tilk: barriers are lowering because of AI, and that's going to allow our students and our future designers and artists to have 344 01:16:33.870 --> 01:16:49.630 Jason Tilk: a higher degree of critical thinking associated with the objects they make, be it the quality of surface and the aesthetics of it, its content, its tone. Yeah, I'm just backing Jimmy up on that. The fact that, 345 01:16:49.760 --> 01:17:16.839 Jason Tilk: yeah, I don't click lassos around objects anymore in Photoshop. I say, select thing, and you're done. And it's so much faster. It's the equivalent of throwing a bunch of words into ChatGPT and saying, make a spreadsheet. And I'm like, man, I don't have to make spreadsheets anymore. Using it as this tool to allow the creativity to happen is, like, so paramount. I know I digressed a little bit, but... 346 01:17:20.360 --> 01:17:45.210 sligon: I had a couple things to say. One, I don't think that AI is so good at selection masking that nobody needs to understand it yet. Like, a lot of people will be like, oh, look, I could just select. But then it's really crappy, and there's lots of little pixelated bits or fingers missing, and things like that. So I think we're a little ways away from that. The other thing I'll say is, this is an example from the book Range by David 347 01:17:45.210 --> 01:17:58.579 sligon: Epstein. You guys probably remember Garry Kasparov, the chess champion, and how he got beaten by a computer, finally. And he was like, well, now my phone chess app is better than I am. 348 01:17:58.870 --> 01:18:23.300 sligon: But that's not the end of the story. He actually did another match where he had access to computer information but was the boss, versus just the computer. And then he won again, because he was able to get tactics right from all of the instant calculations that a computer did, but also be able to think outside of the box 349 01:18:23.300 --> 01:18:32.020 sligon: in a way that AI cannot yet do. So, computer plus person was better than just 350 01:18:32.050 --> 01:18:38.139 sligon: computer, even at the things that a computer would be good at, like chess. 351 01:18:38.200 --> 01:18:47.299 sligon: So for a long time, and ideally, the technology can be used in 352 01:18:47.630 --> 01:18:50.698 sligon: service of, and in freeing up, 353 01:18:51.450 --> 01:19:03.089 sligon: people for human creativity, you know, getting the grunt work, the analysis, the things that AI is good at, but still relying on human judgment.
354 01:19:06.930 --> 01:19:07.870 Kari Weaver: Jackie. 355 01:19:13.490 --> 01:19:34.969 Jackie Mayse: Sorry. So, no, this has been a very interesting conversation, and it's fascinating, like, to see Adobe be able to just do those things with voice prompts. It's very different than, you know, when I did photography in college, and, you know, it was darkroom, and Adobe was just in its beginning. 356 01:19:35.910 --> 01:19:44.900 Jackie Mayse: I'm very interested in AI. I think where I get a little squeamish is, like, to me, you were saying you were telling a student, like, well, AI can just write an essay, 357 01:19:45.080 --> 01:19:54.443 Jackie Mayse: and for me, I sort of put on the brakes, and I say, yes, but, you know, is that really true? That gets into: 358 01:19:55.610 --> 01:20:01.759 Jackie Mayse: the human component is still necessary, and also the practice of writing an essay, 359 01:20:02.130 --> 01:20:13.180 Jackie Mayse: the art of the essay, the art of research, you know, learning those research skills, learning those works cited. I struggle to think that those are all 360 01:20:13.250 --> 01:20:34.339 Jackie Mayse: perhaps skills that are not necessary for students to learn. I think they are necessary. And being in the library world, too, like, I'm seeing in conversation, you know, students contacting libraries and saying, hey, I got this works cited from AI, and can you give me the full text of this article? And the article doesn't exist. 361 01:20:34.640 --> 01:20:52.730 Jackie Mayse: It's, you know, a scholarly article that absolutely doesn't exist. So I think I struggle with: sure, you can have maybe an essay, but where are the sources? Where is the information coming from? And the importance of having students think critically about that, you know, think critically about information. And 362 01:20:52.730 --> 01:21:13.299 Jackie Mayse: for me, I think one big thing I try to do in classes is just, you know, talk about the tools that are available to students and the pros and the cons and the strengths of them. Because, you know, just at a basic level, I think students think, well, I can just Google that. But understanding, like, you know, information: 363 01:21:13.440 --> 01:21:19.690 Jackie Mayse: it comes from the open web, it comes from library sources, more the closed web, and now AI. So just some thoughts there. 364 01:21:21.740 --> 01:21:38.382 Kari Weaver: Yeah, thanks, Jackie. And I appreciate that. That was part of my shrug, the essay thing. I'm like, oh no, there is, like, such craft to an essay itself, and, you know, we need to really honor 365 01:21:38.890 --> 01:22:07.789 Kari Weaver: all of the work that goes into that and know the difference between a quality essay and not. And like you said, it's also the process of doing that, right? There are so many skills involved in doing that. Do we need to promote those skills in the ways where we are having students write essays? I don't know, right? But those are things we need to think about. I really appreciate, Rachel Ferber, your questions in chat as well. 366 01:22:08.230 --> 01:22:33.719 Kari Weaver: These are things that we've been discussing so much, you know: skill loss or degradation, what gaps are created in turn, the point about critical thinking. And I wonder, we have a couple minutes left, is there anything you wanted to kind of focus in on in that question in this moment?
367 01:22:41.830 --> 01:23:06.199 Rachel Ferber (she/her): Hi, hello! I'm thinking about a lot of things. And I just want to say that I really appreciate all of this work that you've done, and your perspectives, and the things that you've shared, and I'm not, like, totally opposed to any of this. I do see a lot of benefits, but I do have a lot of questions, and I think, to echo what George was saying, I have a lot of... 368 01:23:07.700 --> 01:23:12.450 Rachel Ferber (she/her): Just, I think it's important to, like, consider 369 01:23:14.260 --> 01:23:22.719 Rachel Ferber (she/her): possible negative effects of this, right? And one of the things that I'm thinking a lot about recently is just, like, 370 01:23:22.860 --> 01:23:29.025 Rachel Ferber (she/her): the attention spans of our students, and, like, how speed 371 01:23:29.710 --> 01:23:33.120 Rachel Ferber (she/her): and, like, this emphasis on efficiency, 372 01:23:33.520 --> 01:23:36.986 Rachel Ferber (she/her): it doesn't, I think, support 373 01:23:39.180 --> 01:23:56.429 Rachel Ferber (she/her): them developing a better ability to, like, spend focused time on something and to, like, deal with the complexities of what it means to spend time with something, right? Like, I think there's a lot that is lost 374 01:23:56.550 --> 01:24:08.700 Rachel Ferber (she/her): when we, like... You know, I'm someone who has, like, a complicated relationship with technology, and I use it a lot in my work. But 375 01:24:09.950 --> 01:24:12.010 Rachel Ferber (she/her): I also think that, like, 376 01:24:13.200 --> 01:24:22.640 Rachel Ferber (she/her): you know, there are times when, yeah, I want something to be quick and easy, because I don't want to spend my time doing this, but I also think that there are benefits 377 01:24:23.320 --> 01:24:40.690 Rachel Ferber (she/her): to making something and engaging in something that is a little bit difficult and time consuming and inefficient. Right? Like, I don't think inefficiency is a negative thing, and I think it's actually a really powerful tool to, like, push against 378 01:24:42.370 --> 01:24:50.784 Rachel Ferber (she/her): power structures that perhaps are oppressive, right? And that's a larger issue. But I'm sorry, I'm just kind of rambling, but 379 01:24:51.500 --> 01:24:58.650 Rachel Ferber (she/her): I just wanted to, like, put out a few, some other considerations, I suppose. 380 01:24:59.070 --> 01:25:23.280 Kari Weaver: Yeah. And I'm so grateful that that's what we're closing on, because I think that's been a common thread that we're hearing in society, too, about the need to slow down, take time, do things with care. And I love your comments about, like, wondering, creative, more expansive thinking, embracing complexity. 381 01:25:23.280 --> 01:25:46.929 Kari Weaver: We need that space. We need that time to do that. And maybe, you know, AI is something that can support that, and maybe just the frustration with it will support our very thoughtful, careful use of that, and engagement with ideas. Right? So thank you so much for ending on that. 382 01:25:46.930 --> 01:26:07.500 Kari Weaver: Please, you know, I hope to see you at other sessions. If you want to join in our team conversations or want to co-lead a session, anything, please let us know. I'll ask. And just, again, I appreciate the team members on this so much. Any other closing words from fellow team members?
383 01:26:08.650 --> 01:26:17.049 Kari Weaver: No? Thanks so much. I look forward to talking about this more with you at other times. Okay, take care, everyone. 384 01:26:17.410 --> 01:26:18.100 sligon: Thank you. 385 01:26:18.840 --> 01:26:19.680 Suzanne McGinness (she/her): Thanks, Kari.