
Eye Contact Correction: Redirecting the eyes to look at the camera

164 points | 1 year ago | sievedata.com
albert_e1 year ago

Practically --

I feel hardware technology can improve further to allow under-the-LED-display cameras .... so that we can actually look at both the camera and the screen at the same time.

(There are fingerprint sensors under mobile screens now ... and I think even some front-facing cameras are being built in without needing a punch hole or sacrificing pixels. There is scope to make this better and more seamless so we can have multiple cameras, if we want, behind a typical laptop screen or desktop monitor.)

This would make for a genuine look-at-the-camera video whether we are looking at other attendees in a meeting or reading off our slide notes (teleprompter style).

There would be no need to fake it.

More philosophically --

I don't quite like the normalization of AI tampering with actual videos and photos casually -- on mobile phone cameras or elsewhere. Cameras are supposed to capture reality by default. I know there is already heavy noise reduction, color correction, auto exposure etc ... but no need to use that to justify more tampering with individual facial features and expressions.

Videos are and will be used for recording humans as they are. The capturing of their genuine features and expressions should be valued more. Video should help people bond as people, with as genuine body language as possible. Videos will be used as memories of people bygone. Videos will be used as forensic or crime scene evidence.

Let us protect the current state of video capture. All AI enhancements should be marketed separately under a different name, not silently added into existing cameras.

jrussino1 year ago

I agree with your philosophical stance, in general, but this particular use case is one that I've been wanting for years and where I think altering the image can be in some ways more "honest" than showing the raw camera feed.

With an unfiltered camera, it looks like I'm making eye contact with you when I'm actually looking directly at my camera, and likewise it looks like I'm staring off to the side when I'm looking directly at your image in my screen.

A camera centered behind my screen might be marginally better in that regard, but it still wouldn't look quite right.

What I'd really like to see is a filter for video conferencing that is aware of the position of your image on my screen, and modifies the angle of my face and eyes to more closely match what you would actually see from that perspective (e.g. it would look like I'm making direct eye contact when I'm looking at/near the position of your eyes on my screen).

You could imagine this working even for multiple users, where I might be paying attention to one participant or another, and each of their views of me would be updated so that the one I'm paying attention to can tell I'm looking directly at them, and the others know I'm not looking directly at them in that moment.
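
A toy sketch of the per-participant routing side of that idea (the tile layout, names, and helper are made up for illustration; the gaze estimation and the actual re-rendering are the hard parts and are out of scope here):

    from dataclasses import dataclass

    @dataclass
    class Tile:
        """A participant's video tile on my screen, in pixel coordinates."""
        x: int
        y: int
        w: int
        h: int

        def contains(self, px, py):
            return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

    def route_gaze(tiles, gaze_x, gaze_y):
        """Decide, per participant, whether their view of me should get the
        'direct eye contact' rendering. tiles: {participant_id: Tile};
        gaze_x/gaze_y: my estimated gaze point on my own screen."""
        return {pid: tile.contains(gaze_x, gaze_y) for pid, tile in tiles.items()}

    # Looking at Bob's tile -> only Bob's view of me gets the correction.
    tiles = {"alice": Tile(0, 0, 640, 360), "bob": Tile(640, 0, 640, 360)}
    print(route_gaze(tiles, 900, 200))  # {'alice': False, 'bob': True}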

wruza1 year ago

Would be funny if everyone on your screen gave a side eye to the bottom right corner where the currently speaking person is.

Jokes aside, I think you're absolutely right. Online interactions have dynamic geometry, so mounting a camera behind a screen will just not cut it, unless the entire screen is a camera. Also, some people might prefer projecting/receiving no eye contact at all, at times, in situations. And vice versa.

The philosophical stance here is purely traditionalist; it decides on behalf of people. What people would like to use, that should exist. "Videos are and will" is a strange claim, given that the person making it has neither control over it nor any assurance that it will turn out to be true.

albert_e1 year ago

Once we have technology to put a camera under a screen without sacrificing display quality ... we will not stop at one camera.

There will be an array of cameras covering say every 2x2 inch square of your screen.

Just see how many cameras are on today's phones. The same can happen with new camera tech too.

Also there will be a huge commercial driver to put multiple cameras under the screen -- all apps and marketers can track your precise gaze. Ads will pause unless you are actually watching them. I will hate it but it feels inevitable.

sgerenser1 year ago

Eye tracking doesn't require that many cameras; one or two will do. Just look at the Vision Pro: that's its primary input mechanism.

hammock1 year ago

“Eye contact” is not a monolith though. Typically we look at someone’s eyes when we are speaking but their mouth when they are speaking. And eye contact can be a pattern of crossing between their left and right eyes. And making and breaking eye contact are important parts of nonverbal communication. The typical AI “eye contact correction” will do none of this.

redwall_hp1 year ago

It's also extremely culturally dependent. (Never mind that plenty of people in countries that obsess over eye contact find it uncomfortable as well.)

It's generally considered rude or an act of intimidation to maintain eye contact with people in Japan, for example. Not nodding occasionally while someone is talking is also seen as a sign that you're not paying attention. Are we going to modify videos to nod automatically too? Or maybe we can stop trying to fake social interactions and enforcing local customs on the world.

vitorsr1 year ago

> I know there is already heavy noise reduction, color correction, auto exposure etc ... but no need to use that to justify more tampering with individual facial features and expressions.

Critically, the enumerated computational processing units are global transformations, while tampering is inherently a local, "contentful" transformation.

jMyles1 year ago

> the enumerated computational processing units are global transformations, while tampering is inherently a local, "contentful" transformation.

This is a brilliant way to examine / explain the distinction.

YeahThisIsMe1 year ago

I agree with this.

I don't actually want the person I'm talking to to appear to be looking directly into my eyes because it's weird - it means they're looking at the camera and not at me on the screen, talking to them.

mannykannot1 year ago

Indeed - intense eye contact can be unsettling, even without the additional information gleaned from knowing that the other party has chosen to look at the camera.

Eye contact is a subtle and important dynamic in human interaction (to the point where it has been suggested that we have white sclera, while our closest ape cousins do not, as an adaptation in support of easily detecting eye contact.) In a meeting, that includes third parties seeing who is making eye contact with whom.

The systems being discussed here are too simple to restore this natural dynamic, and it is not clear to me that always-on eye contact correction[1] is free of unintended and undesirable consequences - for example, in some circumstances, it might ramp up the tension in a discussion, or it might help someone who is dissembling.

[1] Even with random look-aways, I suspect - in actual conversation, look-aways are often correlated with what's going on in the discussion.

smeej1 year ago

Somehow I've apparently made a different adjustment to this than most people. My therapist was commenting on it the other day, how I do look directly into the camera when I want her to see me as making "eye contact," rather than looking directly at where I see her eyes.

She's taking this as an autistic adaptation NT people are less likely to make, like my gestures are practiced and tailored for the sake of the other, not my own sake. I want to "look in her eyes" to make a point, because that's one of the ways you show people you're making an important point, not to see how she's responding to what I'm saying.

I haven't done any of it on purpose. It's apparently just how I've adapted to the weird communication space of having a gap between actually looking at someone's eyes and being seen to be looking at someone's eyes.

Iku_Tri1 year ago

I don't want to be mean to your therapist, but really?

Understanding camera eyelines counts as autistic now?

You're fine doing that. Sorry, but that comment she made really sent me.

Reminds me of how the film department forced the digital artists to take a cinematography and lighting class IRL so their final project renders would improve.

smeej1 year ago

It might be one thing if I had done it on purpose, because I was thinking about camera eyelines. But it wasn't deliberate. I subconsciously choose based on how another person will see me, because I don't really expect to get a whole lot of information from seeing them. Something about this being a type of "masking" in autistic women, trying harder to get my social cues across to others, but not expecting myself to receive them.

I think maybe I have "trauma masquerading as ASD," because the symptoms are subjectively improving as I learn to down-regulate my nervous system, but then I don't much care what label gets put on why I'm weird. I'm much more interested in figuring out what to do with the different ways I'm weird. I'm old enough that I can't think of ways formal diagnosis would help me, so I'd rather assume each challenge is treatable until I find out that it isn't.

boneitis1 year ago

> because it's weird

I don't get many opportunities to express my exasperation with the paradigm of the youtube content creator's thousand video cuts per spoken sentence, but hell, in the same way, I think it's just $#@%ing weird.

jpicard1 year ago

Here's a webcam that has a small arm that drops down placing the sensor in front of the screen. It blocks a little bit of the screen, but allows more eye contact, without using AI to modify eyes.

https://icontactcamera.com/

IanCal1 year ago

I get the general philosophical point but to take a fun counterargument - cameras don't record the moment. They record a very narrow snapshot in time. One you, or anyone around you, may not recognise.

Have you ever looked at a group of friends and thought "ONE OF YOU IS BLINKING"? No. Yet it's quite common to have a photo where at least one person is mid-blink. The 30-year lifespan of that photo includes the milliseconds they were blinking. Is it untrue to have a picture where two people were not blinking and standing side by side? They did in real life, in those same poses, but fractions of a second apart. Is it a failure to capture reality by having a picture of them with their eyes open? Maybe - or maybe the blending of several moments is more true to the original situation than any specific snapshot could be.

> I feel hardware technology can improve further to allow under-the-LED-display cameras .... so that we can actually look at both the camera and the screen at the same time.

That doesn't fully solve the problem because you'd be looking at the middle of the screen not at the person talking to you in a group.

> Video should help people bond as people, with as genuine body language as possible.

I agree, but having people be able to actually look at each other is surely part of this.

JohnFen1 year ago

> cameras don't record the moment. They record a very narrow snapshot in time.

Isn't a "moment" a very narrow snapshot in time by definition?

wizzwizz41 year ago

Colloquially, "the moment" also includes the context, both immediate and general.

albert_e1 year ago

> That doesn't fully solve the problem because you'd be looking at the middle of the screen not at the person talking to you in a group.

Repeating my comment on a sibling ...

Once we have technology to put a camera under a screen without sacrificing display quality ... we will not stop at one camera. There will be an array of cameras covering say every 2x2 inch square of your screen.

Just see how many cameras are on today's phones. The same can happen with new camera tech too.

Also there will be a huge commercial driver to put multiple cameras under the screen -- all apps and marketers can track your precise gaze. Ads will pause unless you are actually watching them. I will hate it but it feels inevitable

vlovich1231 year ago

Honestly I'll take the software correction approach. Seems cheaper. I'll also challenge whether people actually care about the philosophical position on live editing. Zoom filters to add makeup, and other realtime and non-realtime filters, are popular. Movies have special effects. I think this purism isn't helpful given what it seems people actually want, not to mention that the concept of a "true image" is so tenuous (e.g. no picture of the aurora borealis or the Milky Way is actually what your eye would see).

ballenf1 year ago

Artwork can be more true to reality than a photograph.

aitchnyu1 year ago

Will we have video with sensor signature for evidence purposes? One high court in India rejected any video evidence as a potential deepfake.

prmoustache1 year ago

> One high court in India rejected any video evidence as a potential deepfake.

Well I would have expected any court would have stopped accepting audio and video as evidence by now.

JohnFen1 year ago

All I know is that I personally have stopped giving audio and video any benefit of the doubt. I think it's risky to accept any recording as representative of truth unless you or someone you trust was there at the time of the recording to vouch for its correctness.

agos1 year ago

that would be a lot of baby thrown out along with the bathwater

ballenf1 year ago

Similar to how hearsay evidence is thrown out, despite it potentially having substantial value. Court is exactly the place you want to throw out the baby with the bathwater.

sharpshadow1 year ago

So one could upload the original footage directly to YouTube to get the authenticity label, set it to private, then proceed with the usual edits and provide the link to the private video as proof that it's real.

aspenmayer1 year ago

Presumably that would break the chain of custody metadata, or would leave provenance breadcrumbs leading back to the original unedited video?

https://support.google.com/youtube/answer/15446725?hl=en

> Limitations

> “Captured with a camera” only appears if a creator opts to use C2PA technology during filming. If it’s missing, it doesn’t mean the content has modified audio or visuals.

> Note: This feature is separate from our existing altered and synthetic disclosures.

> The metadata that leads to a “Captured with a camera” disclosure is made by a 3rd party (for example, a camera manufacturer). This means there is some risk that someone could take a photo of another screen showing synthetic content. Because the other screen shows an image that has been modified, it wouldn’t be eligible for the “Captured with a camera” disclosure. This issue is called “air-gapping.” Camera manufacturers will continue to develop detection measures to prevent “air-gapping,” but the sophistication of those detection measures may vary in the near term.

https://blog.google/technology/ai/google-gen-ai-content-tran...

renewiltord1 year ago

The really sad thing is that we take raw sensor data and process it at all. People are so out of touch with things these days we use lenses to focus the picture etc. Why not just transmit the raw sensor data instead of processing everything so much? People could just use their minds (I know, ridiculous to ask people to do that in this era where everything is spoon fed to you) and actually interpret things for once.

What a society! Processed food, plastics in their blood, processed sensor data. Ugh, we have strayed so far from natural interactions.

Philosophically we have abandoned being mindful of where we are, and just being our natural forms instead of being slaves to what some computer is telling you that you should be seeing.

xattt1 year ago

I always thought that under-screen cameras would come as a bug-eye lens, with the sensors between pixels. The pitch of modern mini-LED displays seems to have enough space between pixels to fit them in.

JohnFen1 year ago

> I don't quite like the normalization of AI tampering with actual videos and photos casually -- on mobile phone cameras or elsewhere.

I agree. This is one of the things that I actively worry about.

mikae11 year ago

> I feel hardware technology can improve further to allow under-the-LED-display cameras

Everything but your smartphone is big enough that you'd have to sprinkle your entire screen area with sensors to get the sense of me looking at you. And that won't be cheap.

Say my laptop had a sensor dead center and I was in a group chat. Only the person dead center would see me looking at the camera.

This is better done in software.

yieldcrv1 year ago

Or buy a specialty device for replicating the real world

It's been half a decade already since I first noticed iPhones can't capture a red world when wildfires are messing up the air quality; I had to break out an ILC (a DSLR without the SLR) to capture the world more congruently with how I see it

lloeki1 year ago

> iphones cant capture a red world when wild fires are messing up the air quality

s/iPhones/the iPhone Camera.app/

Apps like Halide and Pro Camera have no trouble handing over control of white balance to you. I've captured both faint aurora borealis and the red/brown hue when sand and dust is brought over inland Europe by the scirocco, with great success.

taeric1 year ago

Beam splitting is a thing. Elgato has a lowish cost one that works quite well.

TowerTall1 year ago

> under-the-LED-display cameras

If people laugh with their mouths open, wouldn’t a camera placed below the LED display capture the inside of their mouths, and the rest of the time just point straight up their noses?

albert_e1 year ago

I meant the camera will be invisible and BEHIND the screen .... just not visible as a punch hole/notch.

I think some mobile phones have already done this...where they are able to put a camera behind the pixels.

ChoHag1 year ago

[dead]

sadcherry1 year ago

[flagged]

irjustin1 year ago

The line is very long and blurry the whole way. One extreme is being completely naked 100% of the time with zero grooming, and the opposite is eugenics or genetically engineering body/facial features (is what I've come to believe?).

Isn't it okay to feel good about looking good? Sure (I love dressing up and doing my hair for occasions)! But obviously that can turn very problematic very fast. Honestly, I wish I knew where to draw the line in the sand. Is it makeup? Piercings? Nice clothes? Surgery?

Just a parent with two daughters who has more questions than answers.

ddingus1 year ago

What we did was draw the line at anything that might close a door in life they may prefer remain open.

Messing with hair in our youth is fun and it grows back. No worries.

Modest piercings society does not frown on. No tattoos, and especially none on the face, hands, etc.

We had boys and girls and it went OK. Not too much complaining, and when they became adults we handed them the keys, wished them well, and help where and how we can.

Maybe our experiences help with understanding yours.

InDubioProRubio1 year ago

Surgery is a permanent, life-long change; beauty is relevant for 20+ years.

exitb1 year ago

Makeup is a personal preference. What OP talked about is subtly and transparently putting AI in a pipeline where we don't expect it. And it's not hypothetical, rather it already happens. Video meeting software is doing all kinds of sound rejection based on an unknown set of rules, even though none of us enabled that as a feature.

tomooot1 year ago

It took me a couple of years to notice the "beautify" filter on my samsung S7 as I only ever activated the screen side camera by accident. When I did eventually use it a bit, I subconsciously knew something was off but assumed it was just spec differences between the two sensors and lenses, but then I noticed the "eyeball star twinkle" and realised what was up.

On closer inspection it turns out it was actually smoothing my hair and boosting the contrast so I looked like I had dyed "highlights", along with airbrushing my cheeks a flat orangey coloured skin tone with a rosy center, as if I were wearing foundation and blusher!

beeflet1 year ago

> Video meeting software is doing all kinds of sound rejection based on an unknown set of rules, even though none of us enabled that as a feature.

It's optional on discord. Besides, it's conceivable that you might create a similar effect with a nice audio hardware setup

JohnFen1 year ago

I fundamentally agree with you, but I wanted to mention the most common reaction I've got from women when discussing this topic: women mostly don't put on makeup for the benefit of men. It has more to do with societal expectation, setting social status with other women, and very often that they like that it's a mask.

UniverseHacker1 year ago

It’s fine if you don’t want to wear makeup, but it’s not your business if other people want to. Your statement seems to assume they are doing it for your benefit specifically, or mistakenly in general, which is not the case- you are being condescending telling people not to do something but you are the one misunderstanding why they are doing it.

voidUpdate1 year ago

Guys, you don't need to modify cars ever! They're fine as they are!

sadcherry1 year ago

If guys do that to adhere to societal norms, then that's equally sad.

DidYaWipe1 year ago

Are you seriously advancing that as a valid comparison?

HeatrayEnjoyer1 year ago

Please, please, tell me this is sarcasm.

master-lincoln1 year ago

I don't think it was. And I agree: makeup is like putting a mask on to hide who you really are because society taught you that you are more valuable this way. People might think they do this for themselves, but it has been put into their minds by media and adverts. This is not healthy, and it's also a waste of resources.

botanical761 year ago

I partially agree with this, but at the same time... I don't feel that shunning the use of makeup and telling people their preferences are actually a result of societal brainwash is a good solution.

If the problem is that society (in bubble X, Y or Z) teaches us our value is judged solely based on our appearance, then we should address the lessons we teach. I feel it is unproductive to play whac-a-mole with the emergent symptoms of such an underlying problem.

botanical761 year ago

I think the only case where a woman's use of makeup can be considered fake is when she lies about using it.

Otherwise, it is just another way humans choose to dress their external appearance for their own pleasure, fulfilment and social intentions. It's not as if it's hard to tell when someone is wearing makeup - that is, at least when you're close enough to be able to inspect their imperfections at all.

It seems to me that this idea about makeup being 'fake' stems from heteronormative dating, where a man may feel he is unable to properly assess a woman's beauty (and her attractiveness to him) if her face has been changed in arbitrary ways. But personally, I don't think we should optimize all human encounters for dating efficiency. More broadly, there is no social contract which stipulates you must wield your natural appearance at all times. I think we need not add more social expectations to an already long list.

sadcherry1 year ago

The pure fact that there is an asymmetry between men and women w.r.t. makeup renders your argument void.

ndndjdjdn1 year ago

Next up. Stop taking showers people!

master-lincoln1 year ago

How is this a fair comparison? There are health benefits to hygiene; there are none from makeup

hgomersall1 year ago

I'm not sure there are any established health benefits of showering routinely. Cleaning in response to contamination, sure, but every day with lots of soap etc I'm more sceptical of.

sadcherry1 year ago

Guys who equate giving up spending 30+ minutes a day painting your face with giving up showering are part of the problem.

It's exactly those unnatural expectations of looks that are put on women, starting at a really young age, that are the issue here. Not boys, just girls. It skews expectations and boom, everybody feels like they have to do it. It's very sad. I'm not saying don't shower, don't cut or even brush your hair, etc. All fine. But the full-on makeup you see walking through a random city in the morning, geez, what are we doing to ourselves. And what are the guys doing? Nothing close to it, but spend a lot of time justifying it.

jujube31 year ago

Sir, this is Hacker News. Nobody here takes showers.

Retr0id1 year ago

Does what it says on the tin, but honestly I find the "uncorrected" video more comfortable to watch.

ddfs1231 year ago

I think it's just that naturally nobody keeps 100% eye contact (except maybe a TV news reporter); it feels like an interrogation.

XorNot1 year ago

This is what I realized is uncomfortable about cameras in group meetings in Teams - I can't mute other people's video, and so it feels intensely weird to have a wall of people staring blankly at you.

RheingoldRiver1 year ago

> I can't mute other people's video

You can switch to another tab or use a miniplayer; in some apps you can focus one person's screen, and if you choose someone who has a static avatar up you'll barely see other people's faces.

The nuclear option is to install PowerToys [0] and put something always-on-top (I'm a fan of the hotkey Win+Space to toggle always-on-top on and off) in the exact position of the other video feeds. Notepad or something.

[0] https://learn.microsoft.com/en-us/windows/powertoys/

abdullahkhalids1 year ago

This has been a feature since 2020 [1]. It similarly exists in Zoom now.

[1] https://answers.microsoft.com/en-us/msteams/forum/all/featur...

prmoustache1 year ago

You can totally turn off incoming video in MS Teams. What you can't do is have it as a default setting, AFAIK.

mvoodarla1 year ago

Original dev here. I tend to agree for this particular demo video as I'm reading a book and I don't blink in the original.

The model tries to copy the blinks of the original video so it's possible that in other conditions, you'd notice less of this.

Fun to see this feedback though, definitely something worth improving :)

mlhpdx1 year ago

I likewise find the “corrections” uncanny. It’s not just the one with the book.

patrickhogan11 year ago

BTW your main site is throwing an error. You probably want to fix it since your post is getting traction.

https://www.sievedata.com/

Application error: a client-side exception has occurred (see the browser console for more information).

mvoodarla1 year ago

Original dev here. Unable to replicate this on my end, try refreshing?

patrickhogan11 year ago

Interesting. The issue occurs because I have WebGL disabled, causing the createShader function you're using to throw an error. You can reproduce this by going to chrome://settings, disabling "Use hardware acceleration when available," refreshing the page, and then triggering the same error.

hanniabu1 year ago

I think it's the lack of subtle movement; it's too strict and really locks the pupils front and center

lloeki1 year ago

You mean frequent saccades?

https://en.wikipedia.org/wiki/File:This_shows_a_recording_of...

or the occasional look away? (for which there appears to be a feature)

> Look Away: enable_look_away helps create a more natural look by allowing the eyes to look away randomly from the camera when speaking

I expect both to be different: while saccades do happen when occasionally looking _away_ from a person, they also happen when looking _directly at_ one person because we don't constantly stare at a very specific unique and precise point on their face.

For the demo video, try enable_look_away = true, look_away_offset_max = 10, look_away_interval_min = 1 and look_away_interval_range = 1 (then submit), which from the result I got should really be the default for a more natural result.
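
For anyone who'd rather drive those knobs from code than the web form, a minimal sketch using Sieve's Python client might look like the following. The function slug, the exact call shape, and the parameter units are assumptions on my part; only the parameter names come from the settings above:

    import sieve  # Sieve's Python client (pip install sievedata)

    # Assumed function slug; check the Sieve dashboard for the exact name.
    eye_contact = sieve.function.get("sieve/eye-contact-correction")

    output = eye_contact.run(
        sieve.File(path="demo.mp4"),      # input video
        enable_look_away=True,            # allow occasional glances off-camera
        look_away_offset_max=10,          # how far the glance wanders (assumed unit)
        look_away_interval_min=1,         # shortest gap between look-aways, seconds
        look_away_interval_range=1,       # extra random interval on top of the minimum
    )
    print(output.path)  # local path of the result (assuming a File is returned)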

karlgkk1 year ago

There are other implementations that do a better job, such as Apple and Google's. They also are less willing to correct eye contact when it's "out of range" so to speak.

kleiba1 year ago

It would be great to have a feature where my closed eyes are replaced with open eyes looking at the camera - then I could sleep through boring meetings.

chucksmash1 year ago

> It would be great to have a feature where my closed eyes are replaced with open eyes looking at the camera

There's already a feature that does this called HR

ClassyJacket1 year ago

It's just a matter of time until half of us are having an LLM deepfake voice clone of us attend "meetings" so we don't have to.

Arch-TK1 year ago

What is the modern obsession with video calls?

I have been working for a company which allowed full remote work without any qualms since before COVID and nobody did video calls back then. Since we end up on site in secure environments we also just get told to disable the camera in the BIOS as part of our laptop hardening.

For things like bi-annual meetings with your manager you would go into your local office.

EasyMark1 year ago

It's bizarre. I recently had an interview where the two people before and myself just did voice + CoderPad over Teams, no video. Then the next person, also with no video, requested I open up my camera. I politely asked that they do the same if I were to do it, and they refused, so I refused. They said the interview was over. I was like fine, have a nice day, to their very rude ending of the interview. It's strange how people expect to have all the power (or video?) in these situations. The hiring manager called me later and got my side, then apologized and asked if we could set up the third interview with another team member, but I politely declined and went on about my day.

nonameiguess1 year ago

I don't remotely get it, either. Most group calls I'm in involve one person presenting something and we're watching that, not each other. For things like a one-on-one call with my manager, we just talk. Shit, sometimes we even talk on the phone, but usually still a computer. He might be wearing no pants for all I care. I usually have a cat on me and am probably laying flat. If it's before 10 AM, because my wife goes to work so late, she's probably walking around naked behind me and nobody needs to be seeing that.

My company was also 100% remote from its start, even before Covid.

As others have stated, it's also just unnerving to have people making nonstop eye contact or staring at any part of your body at all, even if it isn't your eyes. Maybe 90s Los Angeles was an abnormally shitty place to grow up because of the gang activity, but this is the kind of thing kids started fights over all the time. Robert DeNiro's most famous movie scene is about how threatening it feels to have someone looking at you.

This isn't even unique to humans. When you regularly interact with animals, you're taught to look away and not hold direct eye contact because they'll see it as a challenge or threat. I've learned to do this with my own cats to make them more at ease. You learn to blink, narrow your eyes, look to the side.

asdff1 year ago

Not only that, but the obsession with higher and higher resolution video calls. Let's upgrade the laptop cams. No wait, let's use a mirrorless camera and a ring light. Meanwhile the upload speed from your home ISP is still early-2000s tier and Zoom downsamples you to 144p.

schmichael1 year ago

Also a fully remote company worker since pre-COVID, and I vastly prefer calls with the video on. It's the norm at our company. Facial expressions convey enormous amounts of information. Also our team silently claps for people which is admittedly quite silly but hey, we like it.

I'm rarely on calls without video, but when I am I find it jarring when voices just appear out of the ether with only a little flashing icon to indicate who it is I'm listening to.

To each their own!

fleischhauf1 year ago

Having worked with half a remote team from Poland which also never turned the camera on, I can tell you that faces provide a much better method of person disambiguation for Germans than Polish names. (No disrespect to the Polish language or people; my lack of knowledge thereof is to blame here.)

xnx1 year ago

Nvidia has free Broadcast software with an eye contact feature: https://www.nvidia.com/en-us/geforce/news/jan-2023-nvidia-br...

It's from January 2023, so I don't know if they've improved it further since then.

The video conferencing software providers have been way too slow to put whoever is speaking top-center (near where the camera typically is).

zamadatix1 year ago

They also release it as an SDK these days so you can use it from tools like OBS https://www.nvidia.com/en-us/geforce/broadcasting/broadcast-...

mvoodarla1 year ago

Original dev here. That's right, NVIDIA has a version available which we reference in our blog.

https://www.sievedata.com/blog/eye-contact-correction-gaze-c...

Newer models have come out that allow the same thing to be done and control even more than the eyes.

See here: https://github.com/KwaiVGI/LivePortrait/blob/main/assets/doc...

For web-conferencing, local use is great so NVIDIA's tools are what we recommend in that case.

eisenman1 year ago

I appreciated using the Nvidia Tools for remapping webcam eye-contact until I was reviewing a recording and noticed that it changed my eye color. But it’s been a bit. Perhaps an undocumented feature that newer versions/models fixed.

Der_Einzige1 year ago

Thank you very much!

- Signed, everyone who's currently trying to cheat on interviews run by people who think that forcing video on does anything at all to keep them honest.

EasyMark1 year ago

That is so creepy. I put the app up top center just under my camera (or whatever window I'm looking at), and that focuses my eyes close enough to the camera to look like I'm talking at you (on the other side). I don't want some software interpolating my eye contact. Maybe I'm just ancient at 40 and easily creeped out by such things. The concept reminds me of those paintings that seem to always be staring at you no matter the angle.

qwertox1 year ago

Looks somewhat creepy.

The normal thing is not to look uninterruptedly at a person (which is what the camera is supposed to stand in for). For example, when you make a gesture of trying to remember something by looking somewhere else.

EasyMark1 year ago

The creepy portrait that watches you no matter what part of the room you observe it from. For example the two in this jpg

https://publish.purewow.net/wp-content/uploads/sites/2/2024/...

123pie1231 year ago

it's definitely in the uncanny valley for me

he turns his head a little and his eyes look wrong

DidYaWipe1 year ago

Creepy and misguided. Do people stare at you fixedly and unwaveringly during in-person conversations?

And if they do, do you like it?

lloeki1 year ago

> Look Away: enable_look_away helps create a more natural look by allowing the eyes to look away randomly from the camera when speaking

For the demo video, try enable_look_away = true, look_away_offset_max = 10, look_away_interval_min = 1 and look_away_interval_range = 1 (then submit), which from the result I got should really be the default for a more natural result.

dTal1 year ago

Okay, these options are far enough down the slippery slope to present a compelling argument that the whole thing is a Bad Idea. Short hop from here to suppress_yawns=true, and then breezing through enthusiasm_multiplier=1.4 on to enable_AI_avatar=true with bon_mot_interval=240s...

lloeki1 year ago

Did you watch the transformed video with the settings above?

I think you misunderstand the role of "Look Away": it's not like it looks completely sideways, inventing behaviour that does not exist; instead it looks "away" _from the fixed point that would be dead-on camera center_ (that results in this "I'm gonna pierce through your skull with laser eyes" look), substituting it with "when looking - not aiming/scrutinizing - at something, even continuously, human eyes have saccades"

The whole premise of such software (which has already been implemented by Apple in FaceTime with great success) is to _restore_ the reality which is "I'm looking at you but the mechanical offset between camera and window-on-screen destroys the information that I'm in fact looking at you", not invent something that is not real.

Ideally it would even:

- notice actual saccades and reproduce them, only cancelling the offset (super tough, so the next best thing is to fake it, but since these are small, uncontrolled, random-ish movements the approximation is quite sufficient)

- take into account video window position relative to the camera so that if I'm looking away from the window then it stops compensating.

But hey, first implementations are often naive. I give them credit for implementing Look Away because that's one step beyond the naive implementation. I guess it's not the default + tuneables are there because it's still early.

qwertox1 year ago

Usually looking away is part of a gesture which involves the context, like facial muscles and the information being shared ("Hmm, when was this?" makes the eyes look up).

HKH21 year ago

If you mean someone listening is actually staring (without changing their facial expressions), then of course that's weird, but if they are adjusting their facial expressions to show reactions without looking away, what's wrong with that?

JohnFen1 year ago

> if they are adjusting their facial expressions to show reactions without looking away, what's wrong with that?

A whole lot. Even if they have varying facial expressions, not looking away is creepy as hell because looking away during conversations is actually an important aspect of the communication. Not looking away is sending a nonverbal message, and none of the usual ways that's interpreted are positive.

HKH21 year ago

Are you talking about the people listening or the person speaking? I was talking about the former, whereas you seem to be talking about the latter.

not_a_bot_4sho1 year ago

I've never seen an implementation of this that wasn't super creepy past the initial tech demo

Applejinx1 year ago

I put a monitor screen behind my camera for video-making, adjusted so the eyes are just barely showing over the camera. Then, when my eyes are drawn to me on the screen, I'm looking at the camera (just over it) which works pretty well. The eye contact is pretty good, but I can look away or around all I want: I'll just tend to be drawn to 'display of eyes' so I'm hacking that instinct to make better videos.

That said, plenty of people don't make eye contact with the camera much at all :)

lelandfe1 year ago

iOS FaceTime quietly does this by default; I don't notice it

whaaaaat1 year ago

Let me throw out an idea I haven't seen elsewhere here yet -- this corrects for a part of who I am in a way I find uncomfortable and negative.

I'm autistic. I do not make eye contact easily. That's part of who I am, and a useful signal for the people I interact with that I might be neurodiverse.

Something like this could mask or hide that autistic trait, and make me appear allistic. And, for some autistic folks, maybe that's desirable. However, I find the traits that I have around communication to be a deep part of me. Not making eye contact is nothing to be ashamed of, because being autistic is nothing to be ashamed of. The introduction of tools like this could lead to pressure to use them to conform to some standard of "normalcy" that people expect.

While the technical achievement here is neat (and other commenters pointed out places where it's struggling to look good), there's a sort of meta impact that it can have on people who fall outside of normal ranges.

richdougherty1 year ago

Kudos to the dev for coming up with the eye position fixing solution.

Building further on this idea, I wonder if instead of changing the image to look at the camera, we could change the "camera" to be where we're looking.

In other words we could simulate a virtual camera somewhere in the screen, perhaps over the eyes of the person talking.

We could simulate a virtual camera by using the image of the real camera (or cameras), constructing a 3D image of ourselves and re-rendering it from the virtual camera location.

I think this would be really cool. It would be like there was a camera in the centre of our screen. We could stop worrying about looking at the camera and look at the person talking.

Of course this is all very tricky, but does feel possible right now. I think the Apple Vision Pro might do something similar already?

newaccount741 year ago

There is already a lot of research on the 3D reconstruction and camera movement part, for example this SIGGRAPH 2023 paper: https://research.nvidia.com/labs/nxp/lp3d/

In order for this to work for gaze correction, you'd probably need to take into consideration the location of the camera relative to the location of the eyes of the person on the screen, and then correct for how the other person is holding the phone, and it would probably only work for one-on-one calls. Probably need to know the geometry of the phone (camera parameters, screen size, position of camera relative to phone)

Would be amazing, not sure how realistic it is.
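
For the geometry part alone, a rough back-of-the-envelope sketch (the numbers are purely illustrative, not from any real device): the correction needed is just the angle subtended, at the viewer's distance, by the offset between the camera and the point on the screen they are actually looking at.

    import math

    def gaze_correction_angles(camera_xy_cm, target_xy_cm, viewing_distance_cm):
        """Angular offset (degrees) between 'looking at the camera' and
        'looking at a point on the screen', as seen from the viewer's eyes.
        camera_xy_cm / target_xy_cm are positions in the screen plane."""
        dx = target_xy_cm[0] - camera_xy_cm[0]
        dy = target_xy_cm[1] - camera_xy_cm[1]
        yaw = math.degrees(math.atan2(dx, viewing_distance_cm))
        pitch = math.degrees(math.atan2(dy, viewing_distance_cm))
        return yaw, pitch

    # Illustrative: camera centered above a laptop screen, the other person's
    # eyes rendered ~10 cm below it, viewer sitting ~50 cm away.
    print(gaze_correction_angles((0.0, 0.0), (0.0, -10.0), 50.0))
    # -> roughly (0.0, -11.3): the familiar "looking down" appearance.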

scotty791 year ago

I think you'd get a lot by just transforming the eyes so the gaze is relative to a virtual camera located on the screen at the place of the face of the person you are talking to. This way you get eye contact only when you are looking at their face on the screen, but not when you look somewhere else.

mvoodarla1 year ago

This is an interesting idea. We are a little farther off from being able to do this but agree it would look really cool.

blkhawk1 year ago

During corona I built a fold-down thing that put my webcam at eye level on my monitor. Turns out with a large enough monitor it really isn't that bad to have the camera in front of it.

drewbitt1 year ago

There was a Kickstarter-esque product at the time that did this too. Tiny camera that dangles down in front of the screen.

throwaway2901 year ago

I would pay for the opposite.

lemonad1 year ago

I can definitely see the use case, as it is annoying having to choose between actively looking at participants of a meeting on screen or _appearing_ to look at the participants by gazing into the camera and not actually looking at them.

I sometimes use an Elgato Prompter to better enable eye contact during meetings. The camera and lens are mounted behind the screen, so looking at the screen is also looking at the participants. The downside is that the screen is tiny, and leaning forward to read, say, a document does not look that great on camera. So either you have to zoom it substantially or read it on another screen, thus looking away from the participants. In that case, though, you are not looking at the participants, and faking eye contact would be kind of weird.

asdff1 year ago

Somehow the idea of everyone looking at the camera to wave goodbye, while in the process only seeing the camera and not the people you are trying to make virtual eye contact with, is hilarious to me. Like some dystopian comedy.

onemoresoop1 year ago

This is cool, but I don't find it natural, and it's a bit creepy to keep the gaze fixed on the camera at all times. Maybe they should add a threshold of variability and allow the gaze to look away from the camera from time to time.

krosaen1 year ago

This is really cool, glad to see someone making a real product out of this - have seen demos in the past.

One thing I've always wondered is if this could be made to work for group video chats - depending on the tile you are looking at, that person would know, so you could tell who is paying attention to you, or even exchange a furtive glance with a colleague in reaction to something someone else said, like IRL. Even harder, but also cool, would be updating the gaze dynamically so you could tell what they were looking at in your scene - say you have a whiteboard behind you, and you can tell when the person is making eye contact with you vs looking at something you drew on the board.

Original dev make it so! :)

AyyEye1 year ago

Only a techbro would think that "eye contact" means just synthesizing eyes. It's a high bandwidth communication medium and synthesizing it removes what little we had. Yes I know this isn't the first, no I don't think any of this reality-meddling is any less creepy.

alexsmirnov1 year ago

Some decades ago, I worked with professional photo artists. Some made photos for fashion magazines; one won the World Press Photo award a couple of times. When they make portraits, one rule is to always ask the model to look slightly off camera. As some explained, it adds some "life" to the portrait. So the tool does just the opposite of what the art rules say.

patrickhogan11 year ago

Really cool application.

Just a heads up – your main website is showing an error. You might want to fix it since your post is gaining traction. Here's the link: https://www.sievedata.com/

The error message reads: 'Application error: a client-side exception has occurred (check the browser console for more details).'

mvoodarla1 year ago

Original dev here. Unable to replicate this on my end, try refreshing?

patrickhogan11 year ago

Interesting. The issue occurs because I have WebGL disabled, causing the createShader function you're using to throw an error. You can reproduce this by going to chrome://settings, disabling "Use hardware acceleration when available," refreshing the page, and then triggering the same error.

Sorry for duplicate post. Also this feature is enabled by default, but causes issues with several sites.

FabianBeiner1 year ago

There is a startup called "Casablanca.ai", which appeared on the German version of Shark Tank ("Die Höhle der Löwen") and provides exactly this as its offering: https://www.casablanca.ai/en.

kmfrk1 year ago

We've come a long way from red eye correction.

I think it's great that this is labelled as "correction" as in a means of optional postprocessing when it's convenient. Nvidia implying that it's something we should enable by default rubs me the wrong way, but then again, I don't spend my day stuck in virtual meetings.

thekevan1 year ago

From a development standpoint, this is cool.

But the resultant video has a tad bit of uncanny valley going on.

I'd rather learn from the guy on the right.

mvoodarla1 year ago

Original dev here. Agree this video looks uncanny valley, but it's likely because the lighting of the original video is off + I have baggy eyes (I was sleep deprived).

Would recommend trying it on other videos, it is surprisingly good. Although there definitely are areas to improve.

neveroddoreven1 year ago

I wonder how soon we'll just be rendering real-time AI avatars of ourselves that are traced by our facial movements. Don't have to worry about fixing your hair or lighting or wearing nice clothes; just render whatever looks most appealing with a model.

EasyMark1 year ago

Pro-tip, put something interesting near wherever your camera is and look at that. I usually put the window itself (or whatever window is most interesting) up at the top center and that will get your eyes close enough to emulate “eye-contact”

Bengalilol1 year ago

I, for some reason, prefer the original video. I may have an eye contact problem. Otherwise, the feature is nice and almost perfect: there could be some moments where eye contact shouldn't be always on; I bet this would make it more human.

isuckatcoding1 year ago

Cool but why…?

karlgkk1 year ago

Apple does this on the iPhone, by default. When you're looking at someone's face on FaceTime, it modifies the position of your eyes to be looking directly at the camera - so the person on the other end sees you looking at them.

s4i1 year ago

Really? My wife and I FaceTime a lot, and I keep telling her that I can see her looking at her own face in the corner instead of me. Is that tech accurate enough to tell that the person is looking at themselves and not the other participant, and then not correct the eyes if that's the case?

Anyway, I’d much prefer if Apple didn’t silently alter the eye direction of people calling me.

karlgkk1 year ago

I think their adjustment is very minor and only happens when you're looking directly at the camera. It's minor enough that you've almost certainly seen it and not noticed. She may have the setting disabled, or she may be looking far enough away from the camera that it isn't triggering.

deepfriedchokes1 year ago

Videoconferencing.

nicholasbraker1 year ago

I assumed this was a roadmapped feature in (at least) FaceTime on macOS/iOS. I never saw any implementation of it, but I see value in such a feature. Also for Teams etc.

theodorton1 year ago

It's available in Settings > FaceTime > Eye Contact.

nicholasbraker1 year ago

Thanx!

boiler_up8001 year ago

Looks really good and seems fast. My guess would be that this effect needs to be 99% right, or else people will notice something, although they may not be sure exactly what.

vintagedave1 year ago

The results here in their sample video look _really good_: other tech I’ve seen in the past looked “wrong”. But the sample input is not one I’d characterize as looking away from the screen. Eyes move around like the person is thinking. The result video only looks more focused. It’s effective in carrying focus (it really does matter when someone looks directly at you), but it’s making tiny changes.

> Limitations

> Works best with frontal face views and moderate head rotations.

> Extreme head poses or gaze directions may produce less accurate results.

There it is. To use this, I'd like to see an example showing it stop adjusting when "extreme" (aka normal) head poses are used. If it can handle real behavior and improve eye contact in the optimal case, seamlessly switching between adjusting and not adjusting as someone moves around, that would be a good product.

EdwardDiego1 year ago

My issue is the usual one with laptop cameras: if I'm looking at you, my eyes are looking downwards, and it's very awkward speaking into the camera without seeing your face as I speak.

froh1 year ago

Wow, that's creepy :-)

Technically cool; however, I'd prefer some semi-transparent mirror setup.

Such a setup keeps the eyes alive.

rzzzt1 year ago

DIY Perks has a build log on such an optics-based system: https://youtu.be/2AecAXinars

advisedwang1 year ago

This is the real killer feature of Google's project starline, although they also achieve a 3D display.

jpeggtulsa1 year ago

10 cents per minute of video... Pass.

ta86451 year ago

Great. Now, do the rest of me sitting in the seat. If you don't need my real eyes, you don't need any of the real me. We can discuss, whatever it is, in email.

boomskats1 year ago

The thing with eye contact, though, is that it is worthless if you are never able to look away. When it's artificial like this, it's worse than not being there at all. It's just creepy. It was the same with nvidia's implementation a couple of years ago. It was just weird.

I do appreciate that this is a problem worth solving though, and I spent a lot of my time during COVID worrying about the negative impact that normalising loss of eye contact would have on the social interactions of our younger generations.

Back in 2021, I took one of those £50 teleprompter mirrors that YouTubers use, put a 7in Raspberry Pi display in the slot where you're meant to put your phone, and made it my 'work calls display' for a couple of days. The interesting thing is that the only people that noticed without me pointing it out were completely non-technical, and when they did they complimented me on the quality of my webcam rather than the fact I was looking straight at them; they could tell something was better, but couldn't quite put their finger on it. Which is funny because I'm sure being stuck behind a cheap perspex one-way mirror made my actual camera quality a bit worse.

I remember I got to the point where I started playing with cv2 trying to do realtime facial landmark detection on the incoming feed and having a helper process shift the incoming video window around the little screen so that it would keep the bridge of the other person's nose (the point I naturally made eye contact with) pinned to the bit of the screen that was directly in front of the webcam lens. Then one morning I walked into my office, saw this monstrosity on my desk, realised I was nerd sniping myself and gave up.
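
(For the curious, a rough sketch of what that kind of loop might look like - not the actual code: OpenCV's bundled Haar cascade only gives a face box, so the nose bridge is just estimated from it, and the window-moving step is left out since it's entirely window-manager specific.)

    import cv2

    # Bundled Haar cascade; a real landmark model (dlib, mediapipe) would be
    # more precise, but a face box is enough to approximate the nose bridge.
    face_detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    # Where the webcam lens sits, in the small display's pixel coordinates.
    LENS_X, LENS_Y = 400, 0

    def window_offset_for_frame(frame, win_x, win_y):
        """Return (dx, dy) to move the call window so the other person's
        estimated nose bridge stays pinned directly under the lens."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_detector.detectMultiScale(gray, 1.1, 5)
        if len(faces) == 0:
            return 0, 0
        x, y, w, h = faces[0]
        # Rough guess: the nose bridge sits mid-width, ~40% down the face box.
        nose_x = win_x + x + w // 2
        nose_y = win_y + y + int(h * 0.4)
        return LENS_X - nose_x, LENS_Y - nose_y

    # Actually moving the window (e.g. via wmctrl or python-xlib on the Pi)
    # is omitted here, since it is entirely window-manager specific.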

One thing I do remember though is how odd it felt looking at yourself in a mirror without your image being mirrored. Not sure my brain was ready for that one after thousands of years of looking at itself in mirrored surfaces.

Bit of a weird pic but the only one I can find: https://pasteboard.co/BXE6zhbpOD7E.jpg

lloeki1 year ago

> One thing I do remember though is how odd it felt looking at yourself in a mirror without your image being mirrored. Not sure my brain was ready for that one after thousands of years of looking at itself in mirrored surfaces.

Feynman has a good explanation for that: https://www.youtube.com/watch?v=msN87y-iEx0

But it doesn't go deeper as to why we're perceiving ourselves that way, for that we have to dive into biology, neurology, bilateral symmetry, and the fundamentals as to how, as bilaterally symmetric beings, we're able to orient ourselves in a 3D world.

(I recall reading a paper or watching some video about that, but can't find it anymore)

blitzar1 year ago

I wanted to do this but got stuck in the rabbit hole of picking out teleprompters, screens and sizes. In the end my solution was to mount my webcam in the middle of the monitor (with the other party partially obscured). Previously my technique was to look at the camera, not the screen (or have the other party in a very small window at the top of my screen), so partially obscured is an improvement!

jedisct11 year ago

Doesn't FaceTime already do that?

jbverschoor1 year ago

Isn’t this built in FaceTime?

AStonesThrow1 year ago

This is unfortunate, and perhaps more pernicious than obvious deepfakes: a video filter that lies to the recipients.

Several years ago during the pandemic, I enlisted a job coach to get me hired. One of her paramount concerns was my eye-contact with the camera. She said it's so important. Am I paying attention? Am I an honorable man who maintains eye contact when I'm in a conversation? If I look away, am I collecting my thoughts, or prevaricating?

Many supervisors, managers, and teachers will judge their employees by whether they can pay attention during meetings, or if they're distracted, in their phone's screen, looking at keyboard, glancing off at children or spouse. Even more important, if you're meeting your wife and she can't even maintain your attention, what kind of husband are you?

If you employ a gadget to lie about this, then I hope they fire you and find someone who'll be honest. I hope your wife sends you to sleep on the sofa.

function_seven1 year ago

I would go so far as to say the uncorrected gaze is a lie. When I’m on a videoconference, I am looking directly at whoever is speaking, but the camera’s physical placement tells the “lie” that I’m looking down at something else. This is because we haven’t figured out a good way of placing the camera literally wherever the eyes of the other party show up on the screen. So the camera is, by necessity, in the wrong position for video conferencing. But if we can fix it in software, then we can mitigate the “lie” somewhat.

This is especially true for my setup, where I have two screens side-by-side with the camera placed right between them. I just stare at the camera because otherwise it looks like I'm looking way off to the left or right. If I do look at the people who are talking, what they see is me looking off at "something else." That's a lie! :)

AStonesThrow1 year ago

This is true, and unfortunate, but for the past 100 years, everyone has known that to make eye contact with a camera, you look into its lens. The instantaneous display of output is very recent, and if you ask a professional actress or news anchor what they do in the studio, they will tell you that they're trained to look into the camera lens, no matter what's on the monitors.

I contend that it's unproductive to train consumers otherwise. Yeah, we could look at the screen and have software correct it. Or, we may eventually integrate lenses into screens so that they're placed exactly right. But it seems kludgy to do this software fix. Just train people to look in the right place. (I hate iPhones and I'm unable/unwilling to do Facetime with them. Please use Meet or Teams.)

I'm gradually building skills that let me be aware of what's on the screen without having to stare into it. Having a relaxed, wide field of vision helps with many things. Glasses are counterproductive here.

Izkata1 year ago

News anchor yes, actor/actress no. They look off to the side.

AStonesThrow1 year ago

Ah yes, that's true - unless the character is "breaking the fourth wall" like Clarissa Darling, they'll be avoiding the direct gaze of the viewer for sure.

Another example, though, would be vocalists in a video; usually they'll be singing right at the viewer and making a connection there, unless they're just too cool and aloof.

karlgkk1 year ago

> If you employ a gadget to lie about this

This has been enabled on iPhones, by default, for like 5 years now. You never even noticed.

Their implementation only does a small adjustment, which works so well that most people don't even know it's being done.

bravetraveler1 year ago

> You never even noticed

I have seen three cameras in use in nearly a decade. They were all in interviews. I'm not avoiding opportunities, either. Legitimately 4+ hours a day

Might be fair to say not many cared to see/be seen

olyjohn1 year ago

If we never noticed it, do we even need it? I don't use FaceTime, but have never been bothered by where people are looking in any other video conferencing software.

allenu1 year ago

That reminds me of a few months into the pandemic, when one of the VPs at the company I was working at was presenting in a Zoom-based all-hands. He was very clearly looking directly into the eye of the camera, as opposed to looking at his monitor's video feed like everyone else. I remember thinking it felt a little weird, unnatural, and very performative, like a politician: he very obviously and intentionally wanted to come across as more human by looking directly at the audience, yet at the same time it was a fake look, since he wasn't looking into the eyes of any one person, but at a camera.

Perhaps other people didn't think about it as deeply as I did and maybe it did have the intended effect, but I remember I didn't see him or anyone else doing the same thing in any future all-hands.

niij1 year ago

edit: function_seven said it better than I could. You're confused about what the perspective is with videoconferencing. There is no hardware with a camera in the middle of the screen, so you're always "looking away" to some degree.

AStonesThrow1 year ago

No, I'm not confused at all. As I pointed out, the standard for 100 years: if you want eye contact, you look into the camera lens. The only thing that's changed recently is the availability of a direct, instantaneous monitor to distract us.

Furthermore, if this corrects only someone who's looking directly at the screen, it'd be tolerable. But does it also correct eyes looking at a keyboard, eyes looking at a smartphone screen, eyes looking at a wayward toddler? That's worse.

Also... ten cents per minute? That's highway robbery!

niij1 year ago

If I'm looking at a camera lens, I'm not making eye contact. This isn't about broadcasting; it's about videoconferences.

maximilianroos1 year ago

Sounds like the coach helped you maintain eye-contact with the camera. But if we get a tool to do this, then we're lying. Would you say the coach helped you lie?

CGamesPlay1 year ago

That doesn't even make sense. The lie is that you're not doing the thing you are projecting as doing. You just said the coach helped the poster do the thing they projected as doing.

Der_Einzige1 year ago

The fact that your feathers are ruffled is what makes it all the more delicious and delightful that it exists.

All attempts to subvert people's freedom to direct their attention where they want are tyrannical in nature. If you can't detect that it's happening, it effectively has no negative externality. The tree did not make a sound if no one heard it.

This is the same thought that is used to justify not letting cashiers sit while they bag groceries. Those who think this love the taste of boots in their mouth.

I hope that they fire those who refuse to get with the times on AI and instead embrace Luddism, and I hope your wife considers her future with you after the economic ruin that such practices will bring upon your family.

AStonesThrow1 year ago

Really weird non-sequiturs here!

So if you enjoy freedoms like ignoring your boss or zoning out during meetings where you should be paying attention, or missing a lecture by your instructor, and you believe there aren't any negative externalities from your failure to pay attention, then I don't know what to tell you.

Now the WFH revolution is already horrifying managers, because it is much more difficult to determine when employees are engaged and productive versus when they're trying to fake it or have tuned out. If this AI filter removes one of those cues, that's going to keep horrifying businesses everywhere, and they'll double down on RTO calls. I've also heard horror stories about hiring remote workers who fake interviews, rent their identities, deepfake their video, consult AI offscreen to answer interview questions, subcontract to their illegal buddies, and generally use every trick in the book to hoodwink corporations that make the mistake of not having an in-person relationship with their workforce.

My job coach taught me the value of eye contact, and by extension, the value of paying attention to another human being who is engaged in a discussion with me. That is extremely important. In any online interaction, due to reduced cues and limited feedback, any human cue we can maintain is a valuable one.

My lack of eye contact, I believe, is mostly because it can unnerve me to have someone looking intently at me, and I look away from them in order to collect my thoughts, and maintain my train of thought. It's a habit but it's not necessarily effective. It turns out that most of us can indeed carry on a conversation, and not get distracted, when we're looking into someone's eyes.

And the value to the other party is that they know they have our attention! That is a gift! I have no idea how tasting boots is relevant here. Every job I've had has been a mutual gift, and a pleasure to serve my employer, and I've always felt valued for that service, despite the unequal power differential.

It is so weird that you want us to "get with the times on AI" when eye contact is such a basic, very human, and valuable habit of successful people. If AI could facilitate a human connection, I'd be all ears, but in this case, for this article, AI is subverting the signal, encouraging laziness, and simply lying, to "save face", as it were.

BeFlatXIII1 year ago

> zoning out during meetings where you should be paying attention

Also zoning out during the meetings where your presence is required but unnecessary. If you don't pay attention to a university lecture, that's a skill issue on your part.

-------

The role of the worker is to extract as much value from their employer as possible; any productivity is a secondary byproduct.

AStonesThrow1 year ago